diff --git a/README-CH.md b/README-CH.md index ffd2eeb38..ee8b70d8f 100644 --- a/README-CH.md +++ b/README-CH.md @@ -7,22 +7,22 @@ ## 简介 -&nbsp;&nbsp;&nbsp;&nbsp;Visualis是一个基于宜信的开源项目[Davinci](https://github.com/edp963/davinci)开发的数据可视化BI工具。现已被集成到数据应用开发门户[DataSphere Studio](https://github.com/WeBankFinTech/DataSphereStudio)中,此次发布的版本Visualis1.0.0版本支持Linkis1.1.1和DSS1.1.0版本。 +Visualis是一个基于宜信的开源项目[Davinci](https://github.com/edp963/davinci)开发的数据可视化BI工具。现已被集成到数据应用开发门户[DataSphere Studio](https://github.com/WeBankFinTech/DataSphereStudio)中。 -&nbsp;&nbsp;&nbsp;&nbsp;Visualis支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。 +Visualis支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。 ## 功能特性 -&nbsp;&nbsp;&nbsp;&nbsp;基于达芬奇项目, Visualis与DataSphere Studio 1.1.0集成,实现了以下特性: +基于达芬奇项目,Visualis与DataSphere Studio结合,一同实现了以下特性: * 图表水印 * 数据质量校验 * 图表展示优化 * 对接Linkis计算中间件 * Scriptis结果集一键可视化 * 外部应用参数支持 -* View/Widget/Dashboard/Display集成为DataSphere Studio的工作流节点 +* Dashboard/Display集成为DataSphere Studio的工作流节点 -&nbsp;&nbsp;&nbsp;&nbsp;Visualis同时支持以下Davinci v0.3版本的原生功能: +Visualis同时支持以下Davinci的原生功能: * **数据源** * 支持JDBC数据源 * 支持CSV文件上传 @@ -50,13 +50,13 @@ * 支持仪表板授权分享 -## 与DataSphere Studio集成 +## 与DataSphere Studio集成 -&nbsp;&nbsp;&nbsp;&nbsp;Visualis与DataSphere Studio的数据开发、工作流调度和数据质量校验等模块无缝衔接,实现数据应用开发全流程的连贯顺滑用户体验。 +Visualis与DataSphere Studio的数据开发、工作流调度和数据质量校验等模块无缝衔接,实现数据应用开发全流程的连贯顺滑用户体验。 -更多使用说明可参考: [Visualis User Manul Doc](./visualis_docs/zh_CN/Visualis_user_manul_cn.md) +更多信息请访问[DataSphere Studio documentation]()。 
-![Visualis](images/visualis_workflow.gif) +![Visualis](images/Visualis_AppJoint.gif) @@ -66,34 +66,12 @@ ## 文档 -## 安装部署文档 -[编译部署文档](visualis_docs/zh_CN/Visualis_deploy_doc_cn.md) +[单独部署文档](visualis_docs/zh_CN/Visualis_deploy_doc_cn.md) -[AppConn安装文档](visualis_docs/zh_CN/Visualis_appconn_install_cn.md) - -## 使用文档 -[用户使用文档](visualis_docs/zh_CN/Visualis_user_manul_cn.md) +[快速对接DSS和Linkis](visualis_docs/zh_CN/Visualis_dss_integration_cn.md) [Visualis与Davinci的区别](visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md) -## 设计文档 -[Visualis设计文档](visualis_docs/zh_CN/Visualis_design_cn.md) - -[Display和DashBoard预览原理](visualis_docs/zh_CN/Visualis_display_dashboard_privew_cn.md) - -[Visualis接入DSS/Linkis注意点](visualis_docs/zh_CN/Visualis_dss_integration_cn.md) - -[集成LinkisDatasource](visualis_docs/zh_CN/Visualis_linkisdatasource_cn.md) - -[发送邮件实现原理](visualis_docs/zh_CN/Visualis_sendemail_cn.md) - -[绑定sql节点原理](visualis_docs/zh_CN/Visualis_sql_databind_cn.md) - -[虚拟视图设计文档](visualis_docs/zh_CN/Visualis_visual_doc_cn.md) - -## 升级文档 -[升级文档](visualis_docs/zh_CN/visualis_update_cn.md) - ## 交流贡献 ![communication](images/communication.png) diff --git a/README.md b/README.md index d2ab75f3c..2171159e7 100644 --- a/README.md +++ b/README.md @@ -7,21 +7,24 @@ English | [中文](README-CH.md) ## Introduction -&nbsp;&nbsp;&nbsp;&nbsp;Visualis is an open source project based on Yixin [davinci](https://github.com/edp963/davinci) Developed data visualization Bi tool. It has been integrated into the data application development portal [datasphere studio](https://github.com/WeBankFinTech/DataSphereStudio) In this release, visualis1.0.0 supports linkis1.1.1 and dss1.1.0. -&nbsp;&nbsp;&nbsp;&nbsp;Visualis provides data development/exploration functionalities including drag & drop style report definition, diagram correlation analysis, data drilling, global filtering, multi-dimensional analysis and real-time query, with the enhancement of report watermark and data quality management. +Visualis is a BI tool for data visualization. 
It is developed based on the open source project [Davinci](https://github.com/edp963/davinci) contributed by CreditEase. + +Visualis has been integrated into the data application development portal [DataSphere Studio](https://github.com/WeBankFinTech/DataSphereStudio). + +Visualis provides data development/exploration functionalities including drag & drop style report definition, diagram correlation analysis, data drilling, global filtering, multi-dimensional analysis and real-time query, with the enhancement of report watermark and data quality management. ## Features Based on the Davinci project, Visualis achieves the following features with DataSphere Studio: -* Add chart mark -* Data quality inspection -* Optimize chart display -* Linkis adaption for big-data queries +* Report watermark +* Data quality inspection +* Report display optimization +* Linkis adaptation for big-data queries * One-click visualization from Scriptis -* External application parameters support -* View/Widget/Dashboard/Display as an appjoint of DataSphere Studio workflow +* External application parameters support +* Dashboard/Display as an appjoint of DataSphere Studio workflow -Visualis also supports most of the original features of Davinci v0.3. +Visualis also supports most of the original features of Davinci. * Data Source Support * Files in CSV format * JDBC data source @@ -42,7 +45,7 @@ Visualis also supports most of the original features of Davinci v0.3. * Local advanced filter for visual components * Paging mode and slider for huge volumes of data * Integration Support - * Download visual components in CSV format + * Upload visual components in CSV format * Share visual components in a common/authorized way * Share dashboard in a common/authorized way @@ -50,44 +53,25 @@ Visualis also supports most of the original features of Davinci v0.3. 
## DataSphere Studio Integration Visualis seamlessly integrates with the data development, workflow scheduling and data quality management modules of DataSphere Studio, achieving a smooth and consistent user experience across the whole data application development lifecycle. -For more details: [Visualis User Manul Doc](./visualis_docs/en_US/Visualis_user_manul_en.md) +For more details, please visit the [DataSphere Studio documentation](). -![Visualis](images/visualis_workflow.gif) +![Visualis](images/Visualis_AppJoint.gif) + +## Quick start +See the [Quick start]() guide -## Architecture design +## Architecture ![Visualis Architecture](images/architecture.png) -## Documentation - -## Install and deploy documentation -[Compile and deploy documentation](visualis_docs/en_US/Visualis_deploy_doc_en.md) - -[AppConn Installation Documentation](visualis_docs/en_US/Visualis_appconn_install_en.md) - -## User manual -[User documentation](visualis_docs/en_US/Visualis_user_manul_en.md) - -[Difference Between Visualis and Davinci](visualis_docs/en_US/Visualis_Davinci_difference_en.md) - -## Design documentation -[Visualis Design Documentation](visualis_docs/en_US/Visualis_design_en.md) - -[Display and DashBoard preview principle](visualis_docs/en_US/Visualis_display_dashboard_privew_en.md) - -[Visualis access to DSS/Linkis attention points](visualis_docs/en_US/Visualis_dss_integration_en.md) - -[Integrate LinkisDatasource](visualis_docs/en_US/Visualis_linkisdatasource_en.md) - -[How to send emails](visualis_docs/en_US/Visualis_sendemail_en.md) +## Documents -[Principle of binding sql node](visualis_docs/en_US/Visualis_sql_databind_en.md) +[Deploy documentation](visualis_docs/en_US/Visualis_deploy_doc_en.md) -[Virtual View Design Documentation](visualis_docs/en_US/Visualis_visual_doc_en.md) +[Quick integration with DSS and Linkis](visualis_docs/en_US/Visualis_dss_integration_en.md) -## Upgrade documentation -[Upgrade Documentation](visualis_docs/en_US/visualis_update_en.md) +[The 
differences between Visualis and Davinci](visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md) ## Communication diff --git a/assembly/pom.xml b/assembly/pom.xml index 6f2c8ed00..3a1e9ca11 100644 --- a/assembly/pom.xml +++ b/assembly/pom.xml @@ -3,10 +3,18 @@ xmlns="http://maven.apache.org/POM/4.0.0" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 4.0.0 + + + + + + + + visualis com.webank.wedatasphere.dss - 1.0.0 + 0.5.0 visualis-assembly diff --git a/assembly/src/main/assembly/assembly.xml b/assembly/src/main/assembly/assembly.xml index 3bd980002..7ee2e869c 100644 --- a/assembly/src/main/assembly/assembly.xml +++ b/assembly/src/main/assembly/assembly.xml @@ -30,7 +30,7 @@ ${project.parent.basedir} - ${file.separator} + / README* @@ -43,7 +43,6 @@ * - unix . @@ -62,7 +61,6 @@ upgrade.* upgrade-* - unix @@ -82,12 +80,18 @@ text - - - ${project.parent.basedir}/davinci-ui - - davinci-ui - + + + + + + + + + + + + @@ -100,4 +104,11 @@ false + + diff --git a/assembly/src/main/assembly/distribution.xml b/assembly/src/main/assembly/distribution.xml deleted file mode 100644 index 2c48f8356..000000000 --- a/assembly/src/main/assembly/distribution.xml +++ /dev/null @@ -1,61 +0,0 @@ - - - - visualis-server - - zip - - false - visualis-server - - - - - - lib - true - true - false - false - true - - com.amazonaws:aws-java-sdk-autoscaling:jar - com.amazonaws:aws-java-sdk-core:jar - com.amazonaws:aws-java-sdk-ec2:jar - com.amazonaws:aws-java-sdk-route53:jar - com.amazonaws:aws-java-sdk-sts:jar - com.amazonaws:jmespath-java:jar - javax.ws.rs:jsr311-api:jar - software.amazon.ion:ion-java:jar - - - - - - - ${basedir}/src/main/resources - - * - - 0777 - conf - unix - - - - - diff --git a/assembly/src/main/assembly/release.xml b/assembly/src/main/assembly/release.xml index 03d69c191..aa60daeae 100644 --- a/assembly/src/main/assembly/release.xml +++ b/assembly/src/main/assembly/release.xml @@ -17,6 +17,12 @@ limitations under the 
License. >> --> + + + + + + release-beta.4 @@ -85,4 +91,11 @@ false + + diff --git a/bin/build.sh b/bin/build.sh new file mode 100644 index 000000000..4ddf3a0cd --- /dev/null +++ b/bin/build.sh @@ -0,0 +1,9 @@ +#!/usr/bin/env bash + +current_dir=`pwd` +script_dir=$(cd `dirname $0`; pwd) +echo $script_dir +echo $current_dir +cd $script_dir +cd .. +cd $current_dir \ No newline at end of file diff --git a/db/davinci.sql b/bin/davinci.sql similarity index 82% rename from db/davinci.sql rename to bin/davinci.sql index 17c818de9..99d8e8457 100644 --- a/db/davinci.sql +++ b/bin/davinci.sql @@ -1,7 +1,9 @@ SET NAMES utf8mb4; SET FOREIGN_KEY_CHECKS = 0; --- 调度任务 +-- ---------------------------- +-- Table structure for cron_job +-- ---------------------------- DROP TABLE IF EXISTS `cron_job`; CREATE TABLE `cron_job` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -20,9 +22,11 @@ CREATE TABLE `cron_job` ( `update_time` timestamp NULL DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `name_UNIQUE` (`name`) USING BTREE -) ENGINE=InnoDB AUTO_INCREMENT=13 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci; +) ENGINE=MyISAM AUTO_INCREMENT=13 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci; --- 仪表盘 +-- ---------------------------- +-- Table structure for dashboard +-- ---------------------------- DROP TABLE IF EXISTS `dashboard`; CREATE TABLE `dashboard` ( @@ -30,7 +34,7 @@ CREATE TABLE `dashboard` `name` varchar(255) NOT NULL, `dashboard_portal_id` bigint(20) NOT NULL, `type` smallint(1) NOT NULL, - `index` int(4) NOT NULL, -- 1为文件0为DashBoard + `index` int(4) NOT NULL, `parent_id` bigint(20) NOT NULL DEFAULT '0', `config` text, `full_parent_Id` varchar(100) DEFAULT NULL, @@ -41,9 +45,12 @@ CREATE TABLE `dashboard` PRIMARY KEY (`id`) USING BTREE, KEY `idx_dashboard_id` (`dashboard_portal_id`) USING BTREE, KEY `idx_parent_id` (`parent_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- DashBoard顶部控件 +-- 
---------------------------- +-- Table structure for dashboard_portal +-- ---------------------------- DROP TABLE IF EXISTS `dashboard_portal`; CREATE TABLE `dashboard_portal` ( @@ -59,9 +66,12 @@ CREATE TABLE `dashboard_portal` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- Display看板 +-- ---------------------------- +-- Table structure for display +-- ---------------------------- DROP TABLE IF EXISTS `display`; CREATE TABLE `display` ( @@ -77,9 +87,12 @@ CREATE TABLE `display` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- Display的画布组件 +-- ---------------------------- +-- Table structure for display_slide +-- ---------------------------- DROP TABLE IF EXISTS `display_slide`; CREATE TABLE `display_slide` ( @@ -93,9 +106,12 @@ CREATE TABLE `display_slide` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_display_id` (`display_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 下载记录 +-- ---------------------------- +-- Table structure for download_record +-- ---------------------------- DROP TABLE IF EXISTS `download_record`; CREATE TABLE `download_record` ( @@ -108,9 +124,12 @@ CREATE TABLE `download_record` `last_download_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_user` (`user_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- 点赞喜欢 +-- ---------------------------- +-- Table structure for favorite +-- ---------------------------- DROP TABLE IF EXISTS `favorite`; CREATE TABLE `favorite` ( @@ -120,9 +139,12 @@ CREATE TABLE `favorite` `create_time` datetime NOT NULL ON UPDATE 
CURRENT_TIMESTAMP, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `idx_user_project` (`user_id`, `project_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- dashboard里面wideget成员表 +-- ---------------------------- +-- Table structure for mem_dashboard_widget +-- ---------------------------- DROP TABLE IF EXISTS `mem_dashboard_widget`; CREATE TABLE `mem_dashboard_widget` ( @@ -143,9 +165,12 @@ CREATE TABLE `mem_dashboard_widget` PRIMARY KEY (`id`) USING BTREE, KEY `idx_protal_id` (`dashboard_id`) USING BTREE, KEY `idx_widget_id` (`widget_Id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- Display的画布组件中成员表 +-- ---------------------------- +-- Table structure for mem_display_slide_widget +-- ---------------------------- DROP TABLE IF EXISTS `mem_display_slide_widget`; CREATE TABLE `mem_display_slide_widget` ( @@ -164,9 +189,12 @@ CREATE TABLE `mem_display_slide_widget` PRIMARY KEY (`id`) USING BTREE, KEY `idx_slide_id` (`display_slide_id`) USING BTREE, KEY `idx_widget_id` (`widget_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 组织表 +-- ---------------------------- +-- Table structure for organization +-- ---------------------------- DROP TABLE IF EXISTS `organization`; CREATE TABLE `organization` ( @@ -185,9 +213,12 @@ CREATE TABLE `organization` `update_time` timestamp NULL DEFAULT NULL, `update_by` bigint(20) DEFAULT '0', PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 平台表 +-- ---------------------------- +-- Table structure for platform +-- ---------------------------- DROP TABLE IF EXISTS `platform`; CREATE TABLE `platform` ( @@ -204,9 +235,12 @@ CREATE TABLE `platform` `alternateField4` varchar(255) DEFAULT NULL, `alternateField5` varchar(255) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET 
= utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 工程项目表 +-- ---------------------------- +-- Table structure for project +-- ---------------------------- DROP TABLE IF EXISTS `project`; CREATE TABLE `project` ( @@ -225,9 +259,12 @@ CREATE TABLE `project` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 工程管理员表 +-- ---------------------------- +-- Table structure for rel_project_admin +-- ---------------------------- DROP TABLE IF EXISTS `rel_project_admin`; CREATE TABLE `rel_project_admin` ( @@ -240,9 +277,12 @@ CREATE TABLE `rel_project_admin` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `idx_project_user` (`project_id`, `user_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4 COMMENT ='project admin表'; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4 COMMENT ='project admin表'; --- dashboard角色表 +-- ---------------------------- +-- Table structure for rel_role_dashboard +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_dashboard`; CREATE TABLE `rel_role_dashboard` ( @@ -254,9 +294,12 @@ CREATE TABLE `rel_role_dashboard` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `dashboard_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- display角色表 +-- ---------------------------- +-- Table structure for rel_role_display +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_display`; CREATE TABLE `rel_role_display` ( @@ -268,9 +311,12 @@ CREATE TABLE `rel_role_display` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `display_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- dashprotal角色表 +-- ---------------------------- +-- Table 
structure for rel_role_portal +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_portal`; CREATE TABLE `rel_role_portal` ( @@ -282,9 +328,12 @@ CREATE TABLE `rel_role_portal` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `portal_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- 工程角色表 +-- ---------------------------- +-- Table structure for rel_role_project +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_project`; CREATE TABLE `rel_role_project` ( @@ -304,9 +353,12 @@ CREATE TABLE `rel_role_project` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `idx_role_project` (`project_id`, `role_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- display slide画布角色表 +-- ---------------------------- +-- Table structure for rel_role_slide +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_slide`; CREATE TABLE `rel_role_slide` ( @@ -318,9 +370,12 @@ CREATE TABLE `rel_role_slide` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `slide_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- 用户角色表 +-- ---------------------------- +-- Table structure for rel_role_user +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_user`; CREATE TABLE `rel_role_user` ( @@ -333,9 +388,12 @@ CREATE TABLE `rel_role_user` `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `idx_role_user` (`user_id`, `role_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- 角色视图表 +-- ---------------------------- +-- Table structure for rel_role_view +-- ---------------------------- DROP TABLE IF EXISTS `rel_role_view`; CREATE TABLE `rel_role_view` ( @@ -348,9 
+406,12 @@ CREATE TABLE `rel_role_view` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`view_id`, `role_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 角色组织表 +-- ---------------------------- +-- Table structure for rel_user_organization +-- ---------------------------- DROP TABLE IF EXISTS `rel_user_organization`; CREATE TABLE `rel_user_organization` ( @@ -360,9 +421,12 @@ CREATE TABLE `rel_user_organization` `role` smallint(1) NOT NULL DEFAULT '0', PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `idx_org_user` (`org_id`, `user_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 角色表 +-- ---------------------------- +-- Table structure for role +-- ---------------------------- DROP TABLE IF EXISTS `role`; CREATE TABLE `role` ( @@ -377,9 +441,12 @@ CREATE TABLE `role` `avatar` varchar(255) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_orgid` (`org_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4 COMMENT ='权限表'; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4 COMMENT ='权限表'; --- 数据源表 +-- ---------------------------- +-- Table structure for source +-- ---------------------------- DROP TABLE IF EXISTS `source`; CREATE TABLE `source` ( @@ -399,9 +466,12 @@ CREATE TABLE `source` `index` int(5) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 点赞表 +-- ---------------------------- +-- Table structure for star +-- ---------------------------- DROP TABLE IF EXISTS `star`; CREATE TABLE `star` ( @@ -416,7 +486,9 @@ CREATE TABLE `star` ) ENGINE = InnoDB DEFAULT CHARSET = utf8; --- 用户表 +-- ---------------------------- +-- Table structure for user +-- ---------------------------- DROP TABLE IF EXISTS `user`; CREATE TABLE `user` ( @@ -435,9 +507,12 @@ CREATE TABLE `user` 
`update_time` timestamp NOT NULL DEFAULT '1970-01-01 08:00:01', `update_by` bigint(20) NOT NULL DEFAULT '0', PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- 视图view表 +-- ---------------------------- +-- Table structure for view +-- ---------------------------- DROP TABLE IF EXISTS `view`; CREATE TABLE `view` ( @@ -460,16 +535,19 @@ CREATE TABLE `view` `index` int(5) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; --- widget表 +-- ---------------------------- +-- Table structure for widget +-- ---------------------------- DROP TABLE IF EXISTS `widget`; CREATE TABLE `widget` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `name` varchar(255) NOT NULL, `description` varchar(255) DEFAULT NULL, - `view_id` bigint(20), -- 兼容SQL节点作为Source,需要支持null + `view_id` bigint(20) NOT NULL, `project_id` bigint(20) NOT NULL, `type` bigint(20) NOT NULL, `publish` tinyint(1) NOT NULL, @@ -485,9 +563,10 @@ CREATE TABLE `widget` PRIMARY KEY (`id`) USING BTREE, KEY `idx_project_id` (`project_id`) USING BTREE, KEY `idx_view_id` (`view_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8; + --- display画布上widget角色权限表 DROP TABLE IF EXISTS `rel_role_display_slide_widget`; CREATE TABLE `rel_role_display_slide_widget` ( @@ -499,9 +578,10 @@ CREATE TABLE `rel_role_display_slide_widget` `update_by` bigint(20) DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `mem_display_slide_widget_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + --- dashboard上widget角色权限表 DROP TABLE IF EXISTS `rel_role_dashboard_widget`; CREATE TABLE `rel_role_dashboard_widget` ( @@ -513,9 +593,9 @@ CREATE TABLE `rel_role_dashboard_widget` `update_by` bigint(20) DEFAULT NULL, `update_time` 
datetime DEFAULT NULL, PRIMARY KEY (`role_id`, `mem_dashboard_widget_id`) USING BTREE -) ENGINE = InnoDB DEFAULT CHARSET = utf8mb4; +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; --- 访问和操作记录统计表 DROP TABLE IF EXISTS `davinci_statistic_visitor_operation`; CREATE TABLE `davinci_statistic_visitor_operation` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -539,7 +619,6 @@ CREATE TABLE `davinci_statistic_visitor_operation` ( PRIMARY KEY (`id`) USING BTREE ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- 客户端终端信息统计表 DROP TABLE IF EXISTS `davinci_statistic_terminal`; CREATE TABLE `davinci_statistic_terminal` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -559,7 +638,7 @@ CREATE TABLE `davinci_statistic_terminal` ( PRIMARY KEY (`id`) USING BTREE ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- 停留时间统计表 + DROP TABLE IF EXISTS `davinci_statistic_duration`; CREATE TABLE `davinci_statistic_duration` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -570,7 +649,6 @@ CREATE TABLE `davinci_statistic_duration` ( PRIMARY KEY (`id`) USING BTREE ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- 分享和下载记录表 DROP TABLE IF EXISTS `share_download_record`; CREATE TABLE `share_download_record` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -583,12 +661,7 @@ CREATE TABLE `share_download_record` ( PRIMARY KEY (`id`) USING BTREE ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4; --- alter table widget modify view_id bigint null; SET FOREIGN_KEY_CHECKS = 1; -DELETE FROM source; -INSERT INTO `source` ( - id,name,description,config,type,project_id,create_by,create_time,update_by,update_time,parent_id,full_parent_id,is_folder,`index`) -VALUES ( - 1,'hiveDataSource','','{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}','hive',-1,null,null,null,null,null,null,null,null); +INSERT INTO `source` (id,name,description,config,type,project_id,create_by,create_time,update_by,update_time,parent_id,full_parent_id,is_folder,`index`) VALUES 
(1,'hiveDataSource','','{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}','hive',-1,null,null,null,null,null,null,null,null); diff --git a/bin/initdb.bat b/bin/initdb.bat new file mode 100644 index 000000000..725a8a368 --- /dev/null +++ b/bin/initdb.bat @@ -0,0 +1,22 @@ +:: << +:: Davinci +:: == +:: Copyright (C) 2016 - 2019 EDP +:: == +:: Licensed under the Apache License, Version 2.0 (the "License"); +:: you may not use this file except in compliance with the License. +:: You may obtain a copy of the License at +:: http://www.apache.org/licenses/LICENSE-2.0 +:: Unless required by applicable law or agreed to in writing, software +:: distributed under the License is distributed on an "AS IS" BASIS, +:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +:: See the License for the specific language governing permissions and +:: limitations under the License. +:: >> + +@echo off + +for %%x in ("%MYSQL_HOME%") do set MYSQL_HOME=%%~sx +for %%x in ("%DAVINCI3_HOME%") do set DAVINCI3_HOME=%%~sx + +%MYSQL_HOME%\bin\mysql.exe -h localhost -uroot -proot davinci0.3 < %DAVINCI3_HOME%\bin\davinci.sql \ No newline at end of file diff --git a/bin/initdb.sh b/bin/initdb.sh new file mode 100644 index 000000000..bbf1cb229 --- /dev/null +++ b/bin/initdb.sh @@ -0,0 +1,2 @@ +#!/bin/bash +mysql -P 3306 -h localhost -u root -proot davinci0.3 < $DAVINCI3_HOME/bin/davinci.sql diff --git a/bin/migration/README.md b/bin/migration/README.md new file mode 100644 index 000000000..e7fd3060a --- /dev/null +++ b/bin/migration/README.md @@ -0,0 +1,14 @@ +### 注意! + +1. **升级前请务必备份数据!!!, 升级前请务必备份数据!!!, 升级前请务必备份数据!!!** +2. 本次升级只针对 davinci0.3 beta.4 升级至 beta.5, 其他版本请不要执行! +3. 已安装 beta.5 及之后版本无须执行此脚本; +4. 本次升级默认读取 config 下 application.yml 中配置的 davinci 数据源,也可通过参数指定,更多信息请执行 ‘upgrade -help’ 查看; +5. 本次升级可能造成部分 View 中定义的`变量值`错误,请手动修改; +6. 
升级脚本为二进制文件,不同平台执行相应脚本即可,无须重复执行: + + | 平台 | 对应脚本 | + | --- | --- | + |Windows | upgrade.exe | + |Mac OS | upgrade_darwin | + |Linux | upgrade_linux | \ No newline at end of file diff --git a/bin/migration/upgrade.exe b/bin/migration/upgrade.exe new file mode 100644 index 000000000..498d9896e Binary files /dev/null and b/bin/migration/upgrade.exe differ diff --git a/bin/migration/upgrade_darwin b/bin/migration/upgrade_darwin new file mode 100644 index 000000000..c512ed29c Binary files /dev/null and b/bin/migration/upgrade_darwin differ diff --git a/bin/migration/upgrade_linux b/bin/migration/upgrade_linux new file mode 100644 index 000000000..907d23b48 Binary files /dev/null and b/bin/migration/upgrade_linux differ diff --git a/bin/patch/001_beta5.sql b/bin/patch/001_beta5.sql new file mode 100644 index 000000000..38901d6ae --- /dev/null +++ b/bin/patch/001_beta5.sql @@ -0,0 +1,1191 @@ +set @data_base = 'davinci0.3'; + +DROP TABLE IF EXISTS `platform`; +CREATE TABLE `platform` +( + `id` bigint(20) NOT NULL, + `name` varchar(255) NOT NULL, + `platform` varchar(255) NOT NULL, + `code` varchar(32) NOT NULL, + `checkCode` varchar(255) DEFAULT NULL, + `checkSystemToken` varchar(255) DEFAULT NULL, + `checkUrl` varchar(255) DEFAULT NULL, + `alternateField1` varchar(255) DEFAULT NULL, + `alternateField2` varchar(255) DEFAULT NULL, + `alternateField3` varchar(255) DEFAULT NULL, + `alternateField4` varchar(255) DEFAULT NULL, + `alternateField5` varchar(255) DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + +DROP TABLE IF EXISTS `download_record`; +CREATE TABLE `download_record` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `name` varchar(255) NOT NULL, + `user_id` bigint(20) NOT NULL, + `path` varchar(255) DEFAULT NULL, + `status` smallint(1) NOT NULL, + `create_time` datetime NOT NULL, + `last_download_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE, + KEY `idx_user` (`user_id`) USING BTREE +) ENGINE = InnoDB + 
DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_project_admin`; +CREATE TABLE `rel_project_admin` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_id` bigint(20) NOT NULL, + `user_id` bigint(20) NOT NULL, + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE, + UNIQUE KEY `idx_project_user` (`project_id`, `user_id`) USING BTREE +) ENGINE = InnoDB + AUTO_INCREMENT = 6 + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_dashboard`; +CREATE TABLE `rel_role_dashboard` +( + `role_id` bigint(20) NOT NULL, + `dashboard_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `dashboard_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_display`; +CREATE TABLE `rel_role_display` +( + `role_id` bigint(20) NOT NULL, + `display_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `display_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_portal`; +CREATE TABLE `rel_role_portal` +( + `role_id` bigint(20) NOT NULL, + `portal_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `portal_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_project`; +CREATE TABLE `rel_role_project` +( + 
`id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_id` bigint(20) NOT NULL, + `role_id` bigint(20) NOT NULL, + `source_permission` smallint(1) NOT NULL DEFAULT '1', + `view_permission` smallint(1) NOT NULL DEFAULT '1', + `widget_permission` smallint(1) NOT NULL DEFAULT '1', + `viz_permission` smallint(1) NOT NULL DEFAULT '1', + `schedule_permission` smallint(1) NOT NULL DEFAULT '1', + `share_permission` tinyint(1) NOT NULL DEFAULT '0', + `download_permission` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE, + UNIQUE KEY `idx_role_project` (`project_id`, `role_id`) USING BTREE +) ENGINE = InnoDB + AUTO_INCREMENT = 40 + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_slide`; +CREATE TABLE `rel_role_slide` +( + `role_id` bigint(20) NOT NULL, + `slide_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `slide_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_user`; +CREATE TABLE `rel_role_user` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `user_id` bigint(20) NOT NULL, + `role_id` bigint(20) NOT NULL, + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE, + UNIQUE KEY `idx_role_user` (`user_id`, `role_id`) USING BTREE +) ENGINE = InnoDB + AUTO_INCREMENT = 30 + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_view`; +CREATE TABLE `rel_role_view` +( + `view_id` bigint(20) NOT NULL, + `role_id` bigint(20) NOT NULL, + `row_auth` text, + `column_auth` text, + `create_by` bigint(20) DEFAULT 
NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`view_id`, `role_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `role`; +CREATE TABLE `role` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `org_id` bigint(20) NOT NULL, + `name` varchar(100) NOT NULL, + `description` varchar(255) DEFAULT NULL, + `avatar` varchar(255) DEFAULT NULL, + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE, + KEY `idx_orgid` (`org_id`) USING BTREE +) ENGINE = InnoDB + AUTO_INCREMENT = 24 + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_display_slide_widget`; +CREATE TABLE `rel_role_display_slide_widget` +( + `role_id` bigint(20) NOT NULL, + `mem_display_slide_widget_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `mem_display_slide_widget_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `rel_role_dashboard_widget`; +CREATE TABLE `rel_role_dashboard_widget` +( + `role_id` bigint(20) NOT NULL, + `mem_dashboard_widget_id` bigint(20) NOT NULL, + `visible` tinyint(1) NOT NULL DEFAULT '0', + `create_by` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `update_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + PRIMARY KEY (`role_id`, `mem_dashboard_widget_id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +ALTER TABLE `organization` + ADD INDEX `idx_user_id` (`user_id`), + ADD INDEX `idx_allow_create_project` (`allow_create_project`), + ADD INDEX `idx_member_permisson` (`member_permission`); + +ALTER TABLE `project` + ADD 
INDEX `idx_org_id` (`org_id`), + ADD INDEX `idx_user_id` (`user_id`), + ADD INDEX `idx_visibility` (`visibility`); + +ALTER TABLE `rel_user_organization` + ADD INDEX `idx_role` (`role`); + + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'cron_job' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `cron_job` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'cron_job' + AND column_name = 'parent_id') > 0, + "SELECT 1", + "ALTER TABLE `cron_job` ADD `parent_id` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'cron_job' + AND column_name = 'full_parent_id') > 0, + "SELECT 1", + "ALTER TABLE `cron_job` ADD `full_parent_id` varchar(100) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'cron_job' + AND column_name = 'is_folder') > 0, + "SELECT 1", + "ALTER TABLE `cron_job` ADD `is_folder` tinyint(1) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'cron_job' + AND column_name = 'index') > 0, + "SELECT 1", + "ALTER TABLE `cron_job` ADD `index` int(5) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 
'dashboard' + AND column_name = 'full_parent_Id') > 0, + "SELECT 1", + "ALTER TABLE `dashboard` ADD `full_parent_Id` varchar(100) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `dashboard` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `dashboard` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `dashboard` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `dashboard` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard_portal' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `dashboard_portal` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + 
WHERE table_schema = @data_base + AND table_name = 'dashboard_portal' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `dashboard_portal` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard_portal' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `dashboard_portal` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'dashboard_portal' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `dashboard_portal` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `display` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `display` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `display` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + 
(SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `display` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display_slide' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `display_slide` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display_slide' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `display_slide` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display_slide' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `display_slide` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'display_slide' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `display_slide` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_dashboard_widget' + AND column_name = 'config') > 0, + "SELECT 1", + "ALTER TABLE `mem_dashboard_widget` ADD `config` text;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; 
+DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_dashboard_widget' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `mem_dashboard_widget` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_dashboard_widget' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `mem_dashboard_widget` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_dashboard_widget' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `mem_dashboard_widget` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_dashboard_widget' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `mem_dashboard_widget` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_display_slide_widget' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `mem_display_slide_widget` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_display_slide_widget' + AND 
column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `mem_display_slide_widget` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_display_slide_widget' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `mem_display_slide_widget` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'mem_display_slide_widget' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `mem_display_slide_widget` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'project' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `project` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'project' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `project` ADD `create_time` datetime NULL DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'project' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `project` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM 
INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'project' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `project` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'parent_id') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `parent_id` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM 
INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'full_parent_id') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `full_parent_id` varchar(255) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'is_folder') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `is_folder` tinyint(1) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'source' + AND column_name = 'index') > 0, + "SELECT 1", + "ALTER TABLE `source` ADD `index` int(5) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'variable') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `variable` text;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = 
@data_base + AND table_name = 'view' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'parent_id') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `parent_id` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'full_parent_id') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `full_parent_id` varchar(255) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'is_folder') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `is_folder` tinyint(1) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'view' + AND column_name = 'index') > 0, + "SELECT 1", + "ALTER TABLE `view` ADD `index` int(5) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 
'rel_user_organization' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `rel_user_organization` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'rel_user_organization' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `rel_user_organization` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'rel_user_organization' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `rel_user_organization` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'rel_user_organization' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `rel_user_organization` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'create_by') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `create_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'create_time') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `create_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( 
+ (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'update_by') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `update_by` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'update_time') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `update_time` datetime DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'parent_id') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `parent_id` bigint(20) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'full_parent_id') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `full_parent_id` varchar(255) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'is_folder') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `is_folder` varchar(255) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + + +SET @s = (SELECT IF( + (SELECT COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'widget' + AND column_name = 'index') > 0, + "SELECT 1", + "ALTER TABLE `widget` ADD `index` int(5) DEFAULT NULL;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + + +SET @s = (SELECT IF( + (SELECT 
COUNT(*) + FROM INFORMATION_SCHEMA.COLUMNS + WHERE table_schema = @data_base + AND table_name = 'organization' + AND column_name = 'role_num') > 0, + "SELECT 1", + "ALTER TABLE `organization` CHANGE COLUMN `team_num` `role_num` int(20) NULL DEFAULT 0 AFTER `member_num`;" + )); + +PREPARE stmt FROM @s; +EXECUTE stmt; +DEALLOCATE PREPARE stmt; + +update organization +set role_num = 0; + +update download_record +set status = 4 +where last_download_time is not null + and status = 2; diff --git a/bin/patch/002_beta7.sql b/bin/patch/002_beta7.sql new file mode 100644 index 000000000..70d4864e5 --- /dev/null +++ b/bin/patch/002_beta7.sql @@ -0,0 +1,74 @@ +DROP TABLE IF EXISTS `davinci_statistic_visitor_operation`; +CREATE TABLE `davinci_statistic_visitor_operation` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `user_id` bigint(20) DEFAULT NULL, + `email` varchar(255) DEFAULT NULL, + `action` varchar(255) DEFAULT NULL COMMENT 'login/visit/initial/sync/search/linkage/drill/download/print', + `org_id` bigint(20) DEFAULT NULL, + `project_id` bigint(20) DEFAULT NULL, + `project_name` varchar(255) DEFAULT NULL, + `viz_type` varchar(255) DEFAULT NULL COMMENT 'dashboard/display', + `viz_id` bigint(20) DEFAULT NULL, + `viz_name` varchar(255) DEFAULT NULL, + `sub_viz_id` bigint(20) DEFAULT NULL, + `sub_viz_name` varchar(255) DEFAULT NULL, + `widget_id` bigint(20) DEFAULT NULL, + `widget_name` varchar(255) DEFAULT NULL, + `variables` varchar(500) DEFAULT NULL, + `filters` varchar(500) DEFAULT NULL, + `groups` varchar(500) DEFAULT NULL, + `create_time` timestamp NULL DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + +DROP TABLE IF EXISTS `davinci_statistic_terminal`; +CREATE TABLE `davinci_statistic_terminal` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `user_id` bigint(20) DEFAULT NULL, + `email` varchar(255) DEFAULT NULL, + `browser_name` varchar(255) DEFAULT NULL, + `browser_version` varchar(255) DEFAULT NULL, + `engine_name` 
varchar(255) DEFAULT NULL, + `engine_version` varchar(255) DEFAULT NULL, + `os_name` varchar(255) DEFAULT NULL, + `os_version` varchar(255) DEFAULT NULL, + `device_model` varchar(255) DEFAULT NULL, + `device_type` varchar(255) DEFAULT NULL, + `device_vendor` varchar(255) DEFAULT NULL, + `cpu_architecture` varchar(255) DEFAULT NULL, + `create_time` timestamp NULL DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `davinci_statistic_duration`; +CREATE TABLE `davinci_statistic_duration` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `user_id` bigint(20) DEFAULT NULL, + `email` varchar(255) DEFAULT NULL, + `start_time` timestamp NULL DEFAULT NULL, + `end_time` timestamp NULL DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + + +DROP TABLE IF EXISTS `share_download_record`; +CREATE TABLE `share_download_record` +( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `uuid` varchar(50) DEFAULT NULL, + `name` varchar(255) NOT NULL, + `path` varchar(255) DEFAULT NULL, + `status` smallint(1) NOT NULL, + `create_time` datetime NOT NULL, + `last_download_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE = InnoDB + DEFAULT CHARSET = utf8mb4; + diff --git a/bin/phantomjs b/bin/phantomjs deleted file mode 100644 index d72e801ce..000000000 Binary files a/bin/phantomjs and /dev/null differ diff --git a/bin/run.bat b/bin/run.bat new file mode 100644 index 000000000..c2988f5c7 --- /dev/null +++ b/bin/run.bat @@ -0,0 +1,32 @@ +:: << +:: Davinci +:: == +:: Copyright (C) 2016 - 2019 EDP +:: == +:: Licensed under the Apache License, Version 2.0 (the "License"); +:: you may not use this file except in compliance with the License. 
+:: You may obtain a copy of the License at +:: http://www.apache.org/licenses/LICENSE-2.0 +:: Unless required by applicable law or agreed to in writing, software +:: distributed under the License is distributed on an "AS IS" BASIS, +:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +:: See the License for the specific language governing permissions and +:: limitations under the License. +:: >> + +@echo off + +for %%x in ("%JAVA_HOME%") do set JAVA_HOME=%%~sx +for %%x in ("%DAVINCI3_HOME%") do set DAVINCI3_HOME=%%~sx + +if "%1" == "start" ( + echo start Davinci Server + start "Davinci Server" java -Dfile.encoding=UTF-8 -cp .;%JAVA_HOME%\lib\*;%DAVINCI3_HOME%\lib\*; edp.DavinciServerApplication --spring.config.additional-location=file:%DAVINCI3_HOME%\config\application.yml +) else if "%1" == "stop" ( + echo stop Davinci Server + taskkill /fi "WINDOWTITLE eq Davinci Server" +) else ( + echo please use "run.bat start" or "run.bat stop" +) + +pause \ No newline at end of file diff --git a/bin/start-visualis-server.sh b/bin/start-visualis-server.sh index a05483e70..b3fecd75c 100644 --- a/bin/start-visualis-server.sh +++ b/bin/start-visualis-server.sh @@ -34,7 +34,7 @@ export DWS_ENGINE_DEBUG="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,a export DWS_ENGINE_ANAGER_HEAP_SIZE="4G" export DWS_ENGINE_ANAGER_JAVA_OPTS="-Xms$DWS_ENGINE_ANAGER_HEAP_SIZE -Xmx$DWS_ENGINE_ANAGER_HEAP_SIZE -XX:+UseG1GC -XX:MaxPermSize=500m $DWS_ENGINE_DEBUG" -nohup java $DWS_ENGINE_ANAGER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/*:$JAVA_HOME/lib/* org.apache.linkis.DataWorkCloudApplication 2>&1 > $DWS_ENGINE_ANAGER_LOG_PATH/linkis.out & +nohup java $DWS_ENGINE_ANAGER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/*:$JAVA_HOME/lib/* com.webank.wedatasphere.linkis.DataWorkCloudApplication 2>&1 > $DWS_ENGINE_ANAGER_LOG_PATH/linkis.out & pid=$! if [[ -z "${pid}" ]]; then echo "visualis-server start failed!" 
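The migration patch above repeats one guard pattern dozens of times: because MySQL (unlike MariaDB) has no `ADD COLUMN IF NOT EXISTS`, each column addition first probes `INFORMATION_SCHEMA.COLUMNS` and then picks either a no-op or the real `ALTER TABLE` as a string, which is run through a prepared statement (plain `IF` control flow is only allowed inside stored programs). This keeps the patch safe to re-run. An annotated sketch of the idiom, assuming `@data_base` was already set to the target schema name as the patch expects:

```sql
-- Assumption: @data_base holds the Davinci schema, e.g. SET @data_base = DATABASE();
SET @s = (SELECT IF(
        (SELECT COUNT(*)
         FROM INFORMATION_SCHEMA.COLUMNS
         WHERE table_schema = @data_base
           AND table_name = 'widget'
           AND column_name = 'create_by') > 0,
        "SELECT 1",  -- column already present: execute a harmless no-op instead
        "ALTER TABLE `widget` ADD `create_by` bigint(20) DEFAULT NULL;"
    ));

PREPARE stmt FROM @s;       -- the chosen statement is executed dynamically,
EXECUTE stmt;               -- since DDL cannot sit inside IF ... THEN outside
DEALLOCATE PREPARE stmt;    -- a stored procedure
```

The same three-statement tail (`PREPARE`/`EXECUTE`/`DEALLOCATE`) follows every guard in the patch; only the table, column, and column type in the two string branches change.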
diff --git a/bin/start.bat b/bin/start.bat new file mode 100644 index 000000000..5430e6a86 --- /dev/null +++ b/bin/start.bat @@ -0,0 +1,19 @@ +:: << +:: Davinci +:: == +:: Copyright (C) 2016 - 2019 EDP +:: == +:: Licensed under the Apache License, Version 2.0 (the "License"); +:: you may not use this file except in compliance with the License. +:: You may obtain a copy of the License at +:: http://www.apache.org/licenses/LICENSE-2.0 +:: Unless required by applicable law or agreed to in writing, software +:: distributed under the License is distributed on an "AS IS" BASIS, +:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +:: See the License for the specific language governing permissions and +:: limitations under the License. +:: >> + +@echo off +call run.bat start +exit \ No newline at end of file diff --git a/bin/stop.bat b/bin/stop.bat new file mode 100644 index 000000000..7560aa234 --- /dev/null +++ b/bin/stop.bat @@ -0,0 +1,19 @@ +:: << +:: Davinci +:: == +:: Copyright (C) 2016 - 2019 EDP +:: == +:: Licensed under the Apache License, Version 2.0 (the "License"); +:: you may not use this file except in compliance with the License. +:: You may obtain a copy of the License at +:: http://www.apache.org/licenses/LICENSE-2.0 +:: Unless required by applicable law or agreed to in writing, software +:: distributed under the License is distributed on an "AS IS" BASIS, +:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +:: See the License for the specific language governing permissions and +:: limitations under the License. +:: >> + +@echo off +call run.bat stop +exit \ No newline at end of file diff --git a/conf/application.yml b/conf/application.yml index dbeccfb9d..d4474fa1c 100644 --- a/conf/application.yml +++ b/conf/application.yml @@ -1,45 +1,44 @@ -# ################################## -# 1. 
Visualis Service configuration -# ################################## server: protocol: http - address: 127.0.0.1 # server ip address - port: 9008 # server port - url: http://127.0.0.1:8088/dss/visualis # frontend index page full path - access: - address: 127.0.0.1 # frontend address - port: 8088 # frontend port - - -# ################################## -# 2. eureka configuration -# ################################## + address: 127.0.0.1 #server ip address + port: 9007 #port this module serves on (required) + url: http://0.0.0.0:0000/dws/visualis #frontend index page full path + access: + address: 0.0.0.0 #frontend address + port: 0000 #frontend port + +#Eureka Server address, used for service registration (required) eureka: client: serviceUrl: - defaultZone: http://127.0.0.1:20303/eureka/ # Configuration required + defaultZone: http://127.0.0.1:20303/eureka/ instance: metadata-map: test: wedatasphere +#(required) management: endpoints: web: exposure: include: refresh,info +logging: + config: classpath:log4j2.xml + +file: + userfiles-path: ${DAVINCI3_HOME}/userfiles + web_resources: ${DAVINCI3_HOME}/davinci-ui/ + phantomJs-path: ${DAVINCI3_HOME}/bin/phantom.js + base-path: ${DAVINCI3_HOME} -# ################################## -# 3.
Spring configuration -# ################################## spring: - main: - allow-bean-definition-overriding: true application: - name: visualis-dev - datasource: # visualis must share the same database with dss - url: jdbc:mysql://127.0.0.1:3306/dss?characterEncoding=UTF-8&allowMultiQueries=true # Configuration required - username: hadoop - password: hadoop + name: visualis #module name, used for high availability (required) + ## davinci datasource config + datasource: + url: jdbc:mysql://127.0.0.1:3306/xxx?characterEncoding=UTF-8 #application mysql database jdbc url + username: xxx #application mysql database username + password: xxx #application mysql database password driver-class-name: com.mysql.jdbc.Driver initial-size: 2 min-idle: 1 @@ -73,6 +72,7 @@ spring: mvc: static-path-pattern: /** + thymeleaf: mode: HTML5 cache: true @@ -106,25 +106,14 @@ spring: smtp: ssl: enable: false -logging: - config: classpath:log4j2.xml - -# ################################## -# 4. static resource configuration -# ################################## -file: - userfiles-path: ${DAVINCI3_HOME}/userfiles - web_resources: ${DAVINCI3_HOME}/davinci-ui/ - base-path: ${DAVINCI3_HOME} - -sql_template_delimiter: $ -custom-datasource-driver-path: ${DAVINCI3_HOME}/conf/datasource_driver.yml +springfox: + documentation: + swagger: + v2: + path: /api-doc -# ################################## -# 5. SQL configuration -# ################################## pagehelper: supportMethodsArguments: true reasonable: true @@ -145,14 +134,16 @@ mapper: not-empty: false mappers: edp.davinci.dao +sql_template_delimiter: $ + +custom-datasource-driver-path: ${DAVINCI3_HOME}/conf/datasource_driver.yml + +phantomjs_home: ${DAVINCI3_HOME}/bin/phantomjs -# ################################## -# 6.
Screenshot drive -# ################################## email: suffix: "" screenshot: - default_browser: PHANTOMJS + default_browser: PHANTOMJS # PHANTOMJS or CHROME timeout_second: 1800 phantomjs_path: ${DAVINCI3_HOME}/bin/phantomjs chromedriver_path: $your_chromedriver_path$ diff --git a/conf/linkis.properties b/conf/linkis.properties index 019adc6a2..e4371a650 100644 --- a/conf/linkis.properties +++ b/conf/linkis.properties @@ -1,30 +1,24 @@ -# ################################## -# 1. need configuration -# needs to be configured -# ################################## -wds.linkis.gateway.url=http://127.0.0.1:9001 - -# must be kept consistent with the Linkis result set path -wds.linkis.filesystem.root.path=file:///mnt/bdap/ -wds.linkis.filesystem.hdfs.root.path=hdfs:///tmp/linkis - -# ################################## -# 2. can keep the default configuration -# the defaults can be kept -# ################################## - -wds.dss.visualis.query.timeout=1200000 - -wds.linkis.server.version=v1 - +#whether to start in test mode by default wds.linkis.test.mode=false wds.linkis.test.user=test +#packages scanned for RESTful services +wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.linkis.entrance.restful,com.webank.wedatasphere.dss.visualis.restful +wds.linkis.engine.application.name=sparkEngine +wds.linkis.enginemanager.application.name=sparkEngineManager + +wds.linkis.query.application.name=cloud-publicservice -wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.visualis.restful +wds.linkis.console.config.application.name=cloud-publicservice +wds.linkis.engine.creation.wait.time.max=20m +wds.linkis.server.socket.mode=false + +wds.linkis.server.distinct.mode=true +wds.linkis.entrance.config.logPath=file:///tmp/linkis/ wds.dss.visualis.project.name=default -wds.dss.engine.allowed.creators=Visualis,nodeexecution,IDE -wds.linkis.max.ask.executor.time=45m -wds.linkis.server.component.exclude.classes=com.webank.wedatasphere.linkis.entrance.parser.SparkJobParser -wds.dss.visualis.creator=Visualis \ No newline at end of file +wds.linkis.server.version=v1
+wds.linkis.resultSet.store.path=hdfs:///tmp/linkis +wds.dss.visualis.gateway.ip= +wds.dss.visualis.gateway.port= +wds.dss.visualis.query.timeout=1200000 \ No newline at end of file diff --git a/conf/log4j.properties b/conf/log4j.properties new file mode 100644 index 000000000..0807e6087 --- /dev/null +++ b/conf/log4j.properties @@ -0,0 +1,37 @@ +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/db/ddl.sql b/db/ddl.sql deleted file mode 100644 index de4ad025f..000000000 --- a/db/ddl.sql +++ /dev/null @@ -1,51 +0,0 
@@ -CREATE TABLE `visualis_config` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `key` varchar(255) DEFAULT NULL, - `value` varchar(255) DEFAULT NULL, - `scope` varchar(255) DEFAULT NULL, - `username` varchar(255) DEFAULT NULL, - `params` longtext, - PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - -DROP TABLE IF EXISTS `visualis_project`; -CREATE TABLE `visualis_project` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(200) COLLATE utf8_bin DEFAULT NULL, - `description` text COLLATE utf8_bin, - `pic` varchar(255) COLLATE utf8_bin DEFAULT NULL, - `org_id` bigint(20) DEFAULT NULL COMMENT 'Organization ID', - `user_id` bigint(20) DEFAULT NULL, - `star_num` int(11) DEFAULT '0', - `username` varchar(32) COLLATE utf8_bin DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `create_by` varchar(128) COLLATE utf8_bin DEFAULT NULL COMMENT 'Creator', - `update_time` datetime DEFAULT NULL, - `update_by` varchar(128) COLLATE utf8_bin DEFAULT NULL COMMENT 'Modifier', - `visibility` bit(1) DEFAULT NULL, - `is_transfer` bit(1) DEFAULT NULL COMMENT 'Reserved word', - `initial_org_id` bigint(20) DEFAULT NULL, - `isArchive` bit(1) DEFAULT b'0' COMMENT 'If it is archived', - PRIMARY KEY (`id`) -) ENGINE=InnoDB AUTO_INCREMENT=313 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; - --- visualis uses this table for permission management -DROP TABLE IF EXISTS `visualis_user`; -CREATE TABLE `visualis_user` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `email` varchar(255) DEFAULT NULL, - `username` varchar(255) NOT NULL, - `password` varchar(255) DEFAULT NULL, - `admin` tinyint(1) DEFAULT NULL COMMENT 'If it is an administrator', - `active` tinyint(1) DEFAULT NULL COMMENT 'If it is active', - `name` varchar(255) DEFAULT NULL COMMENT 'User name', - `description` varchar(255) DEFAULT NULL, - `department` varchar(255) DEFAULT NULL, - `avatar` varchar(255) DEFAULT NULL COMMENT 'Path of the avatar', - `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP, - `create_by` bigint(20) DEFAULT '0',
`update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP, - `update_by` bigint(20) DEFAULT '0', - `is_first_login` bit(1) DEFAULT NULL COMMENT 'If it is the first time to log in', - PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; \ No newline at end of file diff --git a/docs/_docs/zh/1.1-deployment.md b/docs/_docs/zh/1.1-deployment.md index 331f3f408..5461ffcd4 100644 --- a/docs/_docs/zh/1.1-deployment.md +++ b/docs/_docs/zh/1.1-deployment.md @@ -164,7 +164,7 @@ cache 这里用 redis 作为缓存服务,配置如下: spring: redis: isEnable: false - host: 127.0.0.1 + host: 10.143.131.119 port: 6379 # cluster: diff --git a/ext/pf.ttf b/ext/pf.ttf deleted file mode 100644 index 96d0db72a..000000000 Binary files a/ext/pf.ttf and /dev/null differ diff --git a/images/communication.png b/images/communication.png index 55a0dde5d..12e86727d 100644 Binary files a/images/communication.png and b/images/communication.png differ diff --git a/images/visualis_workflow.gif b/images/visualis_workflow.gif deleted file mode 100644 index 96d268ede..000000000 Binary files a/images/visualis_workflow.gif and /dev/null differ diff --git a/pom.xml b/pom.xml index 955d9fffd..af6a3dd98 100644 --- a/pom.xml +++ b/pom.xml @@ -3,9 +3,17 @@ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 4.0.0 + + + + + + + + visualis com.webank.wedatasphere.dss - 1.0.0 + 0.5.0 pom @@ -19,33 +27,14 @@ assembly server - visualis-appconn ${project.basedir} UTF-8 UTF-8 - 2.11.8 1.8 UTF-8 - 2.12 - 2.15.2 - 1.8 - true - - 1.1.1 - 1.1.0 - - 5.1.2 - 2.3.7.RELEASE - 2.2.1.RELEASE - 5.2.12.RELEASE - 1.4.19 - 1.26 - 2.6.2 - 2.17.1 - 3.13 @@ -98,9 +87,7 @@ 1024m true - - ${java.home}/lib/rt.jar${path.separator}${java.home}/lib/jce.jar${path.separator}${java.home}/../lib/tools.jar - + ${java.home}/lib/rt.jar${path.separator}${java.home}/lib/jce.jar${path.separator}${java.home}/../lib/tools.jar -Xlint:all,-serial,-path @@ -116,16 
+103,6 @@ - - org.projectlombok - lombok - 1.16.22 - - - org.projectlombok - lombok-maven-plugin - 1.16.8.0 - diff --git a/server/README.md b/server/README.md new file mode 100644 index 000000000..e69de29bb diff --git a/server/pom.xml b/server/pom.xml index 066c100e8..d5d70a2ea 100644 --- a/server/pom.xml +++ b/server/pom.xml @@ -1,141 +1,45 @@ + 4.0.0 + + visualis-server + jar + + + + + + + visualis com.webank.wedatasphere.dss - 1.0.0 - ../pom.xml + 0.5.0 - 4.0.0 - visualis-server - jar + + UTF-8 + UTF-8 + 1.8 + true + 6.4.2 + 5.1.2 + 2.0.3.RELEASE + - - com.thoughtworks.xstream - xstream - ${xstream.core.version} - - - - org.apache.logging.log4j - log4j-api - ${log4j2.version} - - - org.apache.logging.log4j - log4j-core - ${log4j2.version} - - - org.apache.logging.log4j - log4j-slf4j-impl - ${log4j2.version} - - - - org.apache.logging.log4j - log4j-1.2-api - ${log4j2.version} - - javax.validation validation-api 2.0.1.Final - - com.webank.wedatasphere.dss - spring-origin-sso-integration-plugin - ${opensource.dss.version} - - - spring-core - org.springframework - - - - - - com.webank.wedatasphere.dss - dss-project-plugin - ${opensource.dss.version} - - - - org.apache.linkis - linkis-module - ${apache.linkis.version} - - - - org.apache.linkis - linkis-common - ${apache.linkis.version} - - - - org.apache.linkis - linkis-computation-client - ${apache.linkis.version} - - - - org.apache.linkis - linkis-cs-client - ${apache.linkis.version} - - - - org.apache.linkis - linkis-storage - ${apache.linkis.version} - - - - org.apache.linkis - linkis-io_file-client - ${apache.linkis.version} - - - - org.apache.linkis - linkis-rpc - ${apache.linkis.version} - - - spring-aop - org.springframework - - - spring-cloud-commons - org.springframework.cloud - - - - - - org.apache.linkis - linkis-bml-client - ${apache.linkis.version} - provided - true - - - gson - com.google.code.gson - - - - - org.apache.linkis - linkis-entrance - ${apache.linkis.version} + com.webank.wedatasphere.linkis 
+ linkis-ujes-spark-entracne + 0.9.1 javax.validation @@ -155,29 +59,38 @@ - - org.yaml - snakeyaml - ${snakeyaml.version} + spring-cloud-commons + org.springframework.cloud + 2.0.0.RELEASE + + + spring-cloud-starter-openfeign + org.springframework.cloud + 2.0.0.RELEASE - + spring-cloud-starter-eureka org.springframework.cloud - spring-cloud-netflix-hystrix - ${spring.cloud.version} + 1.4.4.RELEASE org.springframework.cloud - spring-cloud-starter-openfeign - ${spring.cloud.version} + spring-cloud-netflix-core + 2.0.0.RELEASE + + + + org.apache.logging.log4j + log4j-api + 2.10.0 org.springframework.cloud - spring-cloud-starter-netflix-eureka-client - ${spring.cloud.version} + spring-cloud-netflix-eureka-client + 2.0.0.RELEASE @@ -194,22 +107,6 @@ org.apache.logging.log4j log4j-to-slf4j - - spring-beans - org.springframework - - - spring-context - org.springframework - - - spring-boot-starter-logging - org.springframework.boot - - - spring-aop - org.springframework - @@ -219,50 +116,28 @@ ${spring.boot.version} + + org.springframework.boot spring-boot-starter-cache ${spring.boot.version} - org.springframework.boot spring-boot-starter-thymeleaf ${spring.boot.version} - org.springframework.boot spring-boot-starter-mail ${spring.boot.version} - - org.springframework.boot - spring-boot-autoconfigure - ${spring.boot.version} - - - - org.springframework.boot - spring-boot-starter-aop - - - org.springframework.boot - spring-boot-starter-logging - - - spring-aop - org.springframework - - - ${spring.boot.version} - - com.github.ben-manes.caffeine caffeine - ${caffeine.version} + 2.6.2 @@ -275,25 +150,13 @@ org.springframework spring-context-support - ${spring.version} - - - - org.springframework - spring-aop - ${spring.version} - - - - org.springframework - spring-web - ${spring.version} + 5.0.8.RELEASE org.quartz-scheduler quartz - 2.3.2 + 2.3.0 com.zaxxer @@ -305,7 +168,7 @@ org.mybatis.spring.boot mybatis-spring-boot-starter - 2.1.2 + 1.3.2 @@ -314,41 +177,40 @@ 2.9.9 + + 
org.yaml + snakeyaml + 1.19 + + com.fasterxml.jackson.core jackson-databind - 2.11.0 + 2.9.6 com.google.guava guava - 28.2-jre - - - - - org.scala-lang - scala-library - ${scala.version} + 19.0 - org.scala-lang - scala-compiler - ${scala.version} - - - - org.scala-lang - scala-reflect - ${scala.version} + io.springfox + springfox-swagger2 + 2.6.1 + + + guava + com.google.guava + + - org.scala-lang - scalap - ${scala.version} + io.springfox + springfox-swagger-ui + 2.6.1 @@ -374,7 +236,6 @@ mysql-connector-java 5.1.44 - com.alibaba druid @@ -404,10 +265,14 @@ + + + + com.alibaba fastjson - 1.2.83 + 1.2.58 @@ -419,13 +284,13 @@ org.apache.poi poi - ${poi.version} + 3.9 org.apache.poi poi-ooxml - ${poi.version} + 3.9 @@ -591,18 +456,86 @@ commons-email 1.4 + + + org.apache.hive + hive-jdbc + 1.2.1 + + + jetty-all + org.eclipse.jetty.aggregate + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - - org.apache.maven.plugins - maven-compiler-plugin - - 1.8 - 1.8 - - org.apache.maven.plugins maven-dependency-plugin @@ -634,7 +567,6 @@ net.alchim31.maven scala-maven-plugin - 3.2.2 scala-compile-first @@ -659,6 +591,42 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuth.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuth.java deleted file mode 100644 index 614429bb1..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuth.java +++ /dev/null @@ -1,5 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.auth; - -public interface ProjectAuth { - boolean isPorjectOwner(Long projectId, Long userId); -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuthImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuthImpl.java deleted file mode 100644 index 
7e92e3a39..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/auth/ProjectAuthImpl.java +++ /dev/null @@ -1,31 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.auth; - -import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; -import edp.davinci.dao.ProjectMapper; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Service; - -@Service -public class ProjectAuthImpl implements ProjectAuth { - - private static final Boolean CHECK_PROJECT_USER = (Boolean) CommonConfig.CHECK_PROJECT_USER().getValue(); - - @Autowired - private ProjectMapper projectMapper; - - @Override - public boolean isPorjectOwner(Long projectId, Long userId) { - if(CHECK_PROJECT_USER) { - Integer projectUserId = projectMapper.getProjectUserId(projectId); - if(null == projectUserId) { - return false; - } - if(projectUserId.intValue() == userId.intValue()) { - return true; - } else { - return false; - } - } - return true; - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/CommonContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/CommonContant.java deleted file mode 100644 index 80ba9d162..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/CommonContant.java +++ /dev/null @@ -1,5 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class CommonContant { - public static String URL = "url"; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DashboardContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DashboardContant.java deleted file mode 100644 index 8751bce75..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DashboardContant.java +++ /dev/null @@ -1,11 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class DashboardContant { - - public static String DASHBOARD_ID = "dashboardId"; - 
- public static String DASHBOARD = "dashboard"; - - public static String NAME = "name"; - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DisplayContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DisplayContant.java deleted file mode 100644 index 54d52416d..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/DisplayContant.java +++ /dev/null @@ -1,4 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class DisplayContant { -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ProjectContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ProjectContant.java deleted file mode 100644 index 14c5b0e58..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ProjectContant.java +++ /dev/null @@ -1,4 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class ProjectContant { -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/SourceContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/SourceContant.java deleted file mode 100644 index c9c585831..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/SourceContant.java +++ /dev/null @@ -1,4 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class SourceContant { -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/UserContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/UserContant.java deleted file mode 100644 index 5acb97fe2..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/UserContant.java +++ /dev/null @@ -1,4 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class UserContant { -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ViewContant.java 
b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ViewContant.java deleted file mode 100644 index f615c0a4d..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/ViewContant.java +++ /dev/null @@ -1,6 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class ViewContant { - - public static String VIEW_ID = "viewId"; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/WidgetContant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/WidgetContant.java deleted file mode 100644 index e6b97b628..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/content/WidgetContant.java +++ /dev/null @@ -1,37 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.content; - -public class WidgetContant { - - public static String WIDGET_CONFIG_TEMPLATE = "{${context_info}\"data\":[],\"cols\":[],\"rows\":[]," + - "\"metrics\":[],\"filters\":[],\"color\":{\"title\":\"颜色\",\"type\":\"category\"," + - "\"value\":{\"all\":\"#509af2\"},\"items\":[]},\"chartStyles\":{\"pivot\":{\"fontFamily\":\"PingFang SC\"," + - "\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\"," + - "\"headerBackgroundColor\":\"#f7f7f7\"}},\"selectedChart\":1,\"pagination\":{\"pageNo\":0," + - "\"pageSize\":0,\"withPaging\":false,\"totalCount\":0},\"renderType\":\"clear\",\"orders\":[]," + - "\"mode\":\"pivot\",\"model\":${model_content},\"controls\":[],\"computed\":[],\"cache\":false,\"expired\":300,\"autoLoadData\":true}"; - - public static String WIDGET_CHART_CONFIG_TEMPLE = "{${context_info}\"data\":[],\"pagination\":{\"pageNo\":0," + - "\"pageSize\":0,\"totalCount\":0,\"withPaging\":false}," + - "\"cols\":[],\"rows\":[],\"metrics\":[],\"secondaryMetrics\":[]," + - "\"filters\":[],\"chartStyles\":{\"pivot\":{\"fontFamily\":\"PingFangSC\"," + - "\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\"," + - 
"\"lineColor\":\"#D9D9D9\",\"headerBackgroundColor\":\"#f7f7f7\"}," + - "\"table\":{\"fontFamily\":\"PingFangSC\",\"fontSize\":\"12\"," + - "\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\"," + - "\"headerBackgroundColor\":\"#f7f7f7\",\"headerConfig\":[],\"columnsConfig\":[]," + - "\"leftFixedColumns\":[],\"rightFixedColumns\":[],\"headerFixed\":true," + - "\"autoMergeCell\":false,\"bordered\":true,\"size\":\"small\",\"withPaging\":true," + - "\"pageSize\":\"5000\",\"withNoAggregators\":false}},\"selectedChart\":1," + - "\"orders\":[],\"mode\":\"chart\",\"model\":${model_content},\"controls\":[],\"computed\":[]," + - "\"cache\":false,\"expired\":300,\"autoLoadData\":true,\"query\":null}"; - - public static String WIDGET_ID = "widgetId"; - - public static String NAME = "name"; - - public static String WIDGET = "widget"; - - public static String WIDGETS = "widgets"; - - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/enums/ModuleEnum.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/enums/ModuleEnum.java deleted file mode 100644 index 3f3a78db6..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/enums/ModuleEnum.java +++ /dev/null @@ -1,30 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.enums; - -import java.util.Arrays; - -public enum ModuleEnum { - - DASHBOARD_PORTAL_IDS("dashboardPortalIds", "DASHBOARD ids"), - - DISPLAY_IDS("displayIds", "DISPLAY ids"), - - WIDGET_IDS("widgetIds", "WIDGET ids"), - - VIEW_IDS("viewIds", "VIEW ids"); - - private String name; - private String desc; - - ModuleEnum(String name, String desc) { - this.name = name; - this.desc = desc; - } - - public static ModuleEnum getEnum(String name) { - return Arrays.stream(ModuleEnum.values()).filter(e -> e.getName().equals(name)).findFirst().orElseThrow(NullPointerException::new); - } - - public String getName() { - return name; - } -} diff --git 
a/server/src/main/java/com/webank/wedatasphere/dss/visualis/exception/VGErrorException.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/exception/VGErrorException.java index 3c748133c..c45f71303 100644 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/exception/VGErrorException.java +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/exception/VGErrorException.java @@ -1,8 +1,27 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ package com.webank.wedatasphere.dss.visualis.exception; -import org.apache.linkis.common.exception.ErrorException; +import com.webank.wedatasphere.linkis.common.exception.ErrorException; -public class VGErrorException extends ErrorException { +/** + * Created by johnnwang on 2019/1/23. 
+ */ +public class VGErrorException extends ErrorException { public VGErrorException(int errCode, String desc) { diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/ModifyHttpRequestWrapper.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/ModifyHttpRequestWrapper.java deleted file mode 100644 index 6198b4fc5..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/ModifyHttpRequestWrapper.java +++ /dev/null @@ -1,65 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.integration; - -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import javax.servlet.http.Cookie; -import javax.servlet.http.HttpServletRequest; -import javax.servlet.http.HttpServletRequestWrapper; -import java.util.*; - -/** - * 用来将DSS请求提供的cookie信息复制到visualis侧的cookie中 - */ -public class ModifyHttpRequestWrapper extends HttpServletRequestWrapper { - - Logger logger = LoggerFactory.getLogger(ModifyHttpRequestWrapper.class); - - private Map mapCookies; - - ModifyHttpRequestWrapper(HttpServletRequest request) { - super(request); - logger.info("Wrapper the request sent by DSS appconn."); - this.mapCookies = new HashMap<>(); - } - - void putCookie(String name, String value) { - logger.info("Wrapper the request sent by DSS appconn, And put cookie."); - this.mapCookies.put(name, value); - } - public Cookie[] getCookies() { - logger.info("Wrapper the request sent by DSS appconn, And get cookie."); - HttpServletRequest request = (HttpServletRequest) getRequest(); - Cookie[] cookies = request.getCookies(); - if (mapCookies == null || mapCookies.isEmpty()) { - return cookies; - } - if (cookies == null || cookies.length == 0) { - List cookieList = new LinkedList<>(); - for (Map.Entry entry : mapCookies.entrySet()) { - String key = entry.getKey(); - if (key != null && !"".equals(key)) { - cookieList.add(new Cookie(key, entry.getValue())); - } - } - if (cookieList.isEmpty()) { - return cookies; - } - return 
cookieList.toArray(new Cookie[cookieList.size()]); - } else { - List cookieList = new ArrayList<>(Arrays.asList(cookies)); - for (Map.Entry entry : mapCookies.entrySet()) { - String key = entry.getKey(); - if (key != null && !"".equals(key)) { - for (int i = 0; i < cookieList.size(); i++) { - if(cookieList.get(i).getName().equals(key)){ - cookieList.remove(i); - } - } - cookieList.add(new Cookie(key, entry.getValue())); - } - } - return cookieList.toArray(new Cookie[cookieList.size()]); - } - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisProjectAuthInterceptor.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisProjectAuthInterceptor.java deleted file mode 100644 index 3bdb5c823..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisProjectAuthInterceptor.java +++ /dev/null @@ -1,44 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.integration; - -import com.webank.wedatasphere.dss.standard.app.structure.project.plugin.filter.AbstractProjectAuthInterceptor; -import com.webank.wedatasphere.dss.standard.app.structure.project.plugin.filter.ProjectRequestType; -import org.springframework.stereotype.Component; - -import javax.servlet.http.HttpServletRequest; - -@Component -public class VisualisProjectAuthInterceptor extends AbstractProjectAuthInterceptor { - - @Override - public boolean isProjectRequest(HttpServletRequest request) { - return true; - } - - @Override - protected Object getForbiddenMsg(String s) { - return s; - } - - @Override - public String getProjectId(HttpServletRequest httpServletRequest) { - return null; - } - - @Override - public String getProjectName(HttpServletRequest httpServletRequest) { - return null; - } - - @Override - public ProjectRequestType getProjectRequestType(HttpServletRequest httpServletRequest) { - if("GET".equalsIgnoreCase(httpServletRequest.getMethod())){ - return ProjectRequestType.Access; - } else if 
("PUT".equalsIgnoreCase(httpServletRequest.getMethod())) {
-            return ProjectRequestType.Edit;
-        } else if ("DELETE".equalsIgnoreCase(httpServletRequest.getMethod())){
-            return ProjectRequestType.Delete;
-        } else {
-            return ProjectRequestType.Execute;
-        }
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisSSOFilterInitializer.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisSSOFilterInitializer.java
deleted file mode 100644
index f35ada2cc..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisSSOFilterInitializer.java
+++ /dev/null
@@ -1,25 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.integration;
-
-import com.webank.wedatasphere.dss.standard.app.sso.origin.filter.spring.SpringOriginSSOPluginFilter;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.web.WebApplicationInitializer;
-
-import javax.servlet.FilterRegistration;
-import javax.servlet.ServletContext;
-import javax.servlet.ServletException;
-
-/**
- * Adds the SSO filter provided by DSS into Visualis' HTTP request processing chain.
- * */
-public class VisualisSSOFilterInitializer implements WebApplicationInitializer {
-    Logger logger = LoggerFactory.getLogger(VisualisSSOFilterInitializer.class);
-
-    @Override
-    public void onStartup(ServletContext servletContext) throws ServletException {
-        logger.info("Add DSS filter to the request processing of visualis.");
-        FilterRegistration.Dynamic myFilter = servletContext.addFilter("dssSSOFilter", SpringOriginSSOPluginFilter.class);
-        myFilter.addMappingForUrlPatterns(null, false, "/*");
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisUserInterceptor.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisUserInterceptor.java
deleted file mode 100644
index 8638f98e8..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/integration/VisualisUserInterceptor.java
+++ /dev/null
@@ -1,43 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.integration;
-
-import com.webank.wedatasphere.dss.standard.app.sso.builder.DssMsgBuilderOperation;
-import com.webank.wedatasphere.dss.standard.app.sso.plugin.filter.HttpRequestUserInterceptor;
-import org.springframework.stereotype.Component;
-
-import javax.servlet.http.HttpServletRequest;
-import java.util.Map;
-
-/**
- * Operates on the user information held in the HTTP session.
- * */
-@Component
-public class VisualisUserInterceptor implements HttpRequestUserInterceptor {
-
-    public void addUserToSession(String username, HttpServletRequest httpServletRequest) {
-        httpServletRequest.setAttribute("dss-user", username);
-    }
-
-    public HttpServletRequest wrapRequest(DssMsgBuilderOperation.DSSMsg dssMsg, HttpServletRequest httpServletRequest) {
-        ModifyHttpRequestWrapper requestWrapper = new ModifyHttpRequestWrapper(httpServletRequest);
-        for (Map.Entry cookies : dssMsg.getCookies().entrySet()) {
-            requestWrapper.putCookie(cookies.getKey(), cookies.getValue());
-        }
-        return requestWrapper;
-    }
-
-    @Override
-    public boolean isUserExistInSession(HttpServletRequest httpServletRequest) {
-        return httpServletRequest.getAttribute("dss-user") != null;
-    }
-
-    @Override
-    public String getUser(HttpServletRequest httpServletRequest) {
-        return (String) httpServletRequest.getAttribute("dss-user");
-    }
-
-
-    @Override
-    public HttpServletRequest addUserToRequest(String user, HttpServletRequest httpServletRequest){
-        return httpServletRequest;
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/DWCResultInfo.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/DWCResultInfo.java
index aa62841b7..b14844f66 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/DWCResultInfo.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/DWCResultInfo.java
@@ -1,13 +1,32 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.model;
 
+/**
+ * Created by johnnwang on 2019/1/24.
+ */
 public class DWCResultInfo {
-    private String fileName;
+    private String fileName;
     private String executionCode;
     private String resultPath;
-    private int resultNumber;
+    private int resultNumber;
     private String runType;
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/EmailInfo.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/EmailInfo.java
index 14ac773fe..9f6990b64 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/EmailInfo.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/EmailInfo.java
@@ -1,5 +1,22 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.model;
+
 public class EmailInfo {
     private String cc;
     private String to;
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/HiveSource.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/HiveSource.java
index 7793fa248..d712d3e72 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/HiveSource.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/HiveSource.java
@@ -1,10 +1,31 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.model;
+
 import com.alibaba.fastjson.JSONObject;
 import edp.davinci.model.Source;
 import lombok.Data;
 import lombok.extern.slf4j.Slf4j;
+/**
+ * created by cooperyang on 2019/1/24
+ * Description:
+ */
 @Slf4j
 @Data
 public class HiveSource extends Source {
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/PaginateWithExecStatus.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/PaginateWithExecStatus.java
deleted file mode 100644
index 430767991..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/PaginateWithExecStatus.java
+++ /dev/null
@@ -1,38 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model;
-
-import org.apache.linkis.scheduler.queue.SchedulerEventState;
-import edp.core.model.PaginateWithQueryColumns;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-
-@Data
-@EqualsAndHashCode(callSuper = false)
-public class PaginateWithExecStatus extends PaginateWithQueryColumns {
-    private String execId = "";
-    private String status = SchedulerEventState.Inited().toString();
-    private float progress = -1;
-
-    public String getExecId() {
-        return execId;
-    }
-
-    public void setExecId(String execId) {
-        this.execId = execId;
-    }
-
-    public String getStatus() {
-        return status;
-    }
-
-    public void setStatus(String status) {
-        this.status = status;
-    }
-
-    public float getProgress() {
-        return progress;
-    }
-
-    public void setProgress(float progress) {
-        this.progress = progress;
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveDBModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveDBModel.java
deleted file mode 100644
index 61b5c5936..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveDBModel.java
+++ /dev/null
@@ -1,41 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.hivemodel;
-
-import java.util.List;
-
-public class HiveDBModel extends HiveModel {
-    public static class HiveDB {
-        private String dbName;
-
-        public String getDbName() {
-            return dbName;
-        }
-
-        public void setDbName(String dbName) {
-            this.dbName = dbName;
-        }
-    }
-
-    public static class Data {
-        private List dbs;
-
-        public List getDbs() {
-            return dbs;
-        }
-
-        public void setDbs(List dbs) {
-            this.dbs = dbs;
-        }
-    }
-
-    private Data data;
-
-
-    public Data getData() {
-        return data;
-    }
-
-    public void setData(Data data) {
-        this.data = data;
-    }
-}
-
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveModel.java
deleted file mode 100644
index 0f4250daf..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveModel.java
+++ /dev/null
@@ -1,32 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.hivemodel;
-
-public abstract class HiveModel {
-    private String method;
-    private int status;
-    private String message;
-
-    public String getMethod() {
-        return method;
-    }
-
-    public void setMethod(String method) {
-        this.method = method;
-    }
-
-    public int getStatus() {
-        return status;
-    }
-
-    public void setStatus(int status) {
-        this.status = status;
-    }
-
-    public String getMessage() {
-        return message;
-    }
-
-    public void setMessage(String message) {
-        this.message = message;
-    }
-}
-
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/ExportedProject.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/ExportedProject.java
deleted file mode 100644
index f04c86abc..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/ExportedProject.java
+++ /dev/null
@@ -1,17 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.optmodel;
-
-import edp.davinci.model.*;
-import lombok.Data;
-
-import java.util.List;
-
-@Data
-public class ExportedProject {
-    String name;
-    List sources;
-    List views;
-    List widgets;
-    List displays;
-    List dashboardPortals;
-
-}
\ No newline at end of file
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/IdCatalog.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/IdCatalog.java
deleted file mode 100644
index dcef940b7..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/IdCatalog.java
+++ /dev/null
@@ -1,89 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.optmodel;
-
-import com.google.common.collect.Maps;
-
-import java.util.Map;
-
-public class IdCatalog {
-    private Map source = Maps.newHashMap();
-    private Map view = Maps.newHashMap();
-    private Map widget = Maps.newHashMap();
-    private Map display = Maps.newHashMap();
-    private Map displaySlide = Maps.newHashMap();
-    private Map memDisplaySlideWidget = Maps.newHashMap();
-    private Map dashboardPortal = Maps.newHashMap();
-    private Map dashboard = Maps.newHashMap();
-    private Map memDashboardWidget = Maps.newHashMap();
-
-    public Map getSource() {
-        return source;
-    }
-
-    public void setSource(Map source) {
-        this.source = source;
-    }
-
-    public Map getView() {
-        return view;
-    }
-
-    public void setView(Map view) {
-        this.view = view;
-    }
-
-    public Map getWidget() {
-        return widget;
-    }
-
-    public void setWidget(Map widget) {
-        this.widget = widget;
-    }
-
-    public Map getDisplay() {
-        return display;
-    }
-
-    public void setDisplay(Map display) {
-        this.display = display;
-    }
-
-    public Map getDisplaySlide() {
-        return displaySlide;
-    }
-
-    public void setDisplaySlide(Map displaySlide) {
-        this.displaySlide = displaySlide;
-    }
-
-    public Map getMemDisplaySlideWidget() {
-        return memDisplaySlideWidget;
-    }
-
-    public void setMemDisplaySlideWidget(Map memDisplaySlideWidget) {
-        this.memDisplaySlideWidget = memDisplaySlideWidget;
-    }
-
-    public Map getDashboardPortal() {
-        return dashboardPortal;
-    }
-
-    public void setDashboardPortal(Map dashboardPortal) {
-        this.dashboardPortal = dashboardPortal;
-    }
-
-    public Map getDashboard() {
-        return dashboard;
-    }
-
-    public void setDashboard(Map dashboard) {
-        this.dashboard = dashboard;
-    }
-
-    public Map getMemDashboardWidget() {
-        return memDashboardWidget;
-    }
-
-    public void setMemDashboardWidget(Map memDashboardWidget) {
-        this.memDashboardWidget = memDashboardWidget;
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboard.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboard.java
deleted file mode 100644
index 9d512c4b1..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboard.java
+++ /dev/null
@@ -1,13 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.optmodel;
-
-import edp.davinci.model.Dashboard;
-import edp.davinci.model.MemDashboardWidget;
-import lombok.Data;
-
-import java.util.List;
-
-@Data
-public class PlainDashboard {
-    Dashboard dashboard;
-    List memDashboardWidgets;
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboardPortal.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboardPortal.java
deleted file mode 100644
index 4e08139c0..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDashboardPortal.java
+++ /dev/null
@@ -1,12 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.optmodel;
-
-import edp.davinci.model.DashboardPortal;
-import lombok.Data;
-
-import java.util.List;
-
-@Data
-public class PlainDashboardPortal {
-    DashboardPortal dashboardPortal;
-    List dashboards;
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDisplay.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDisplay.java
deleted file mode 100644
index f62ae94ee..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/optmodel/PlainDisplay.java
+++ /dev/null
@@ -1,16 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.model.optmodel;
-
-import edp.davinci.model.Display;
-import edp.davinci.model.DisplaySlide;
-import edp.davinci.model.MemDisplaySlideWidget;
-import lombok.Data;
-
-import java.util.List;
-
-@Data
-public class PlainDisplay {
-    Display display;
-    DisplaySlide displaySlide;
-    List memDisplaySlideWidgets;
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/StatementGenerator.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/StatementGenerator.java
deleted file mode 100644
index 091efde3a..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/StatementGenerator.java
+++ /dev/null
@@ -1,13 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.generator;
-
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import edp.davinci.dto.viewDto.DistinctParam;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-import edp.davinci.model.User;
-
-public interface StatementGenerator {
-
-    String generate(VirtualView virtualView, ViewExecuteParam executeParam, User user);
-
-    String generateDistinct(VirtualView virtualView, DistinctParam param, User user);
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/VirtualSqlStatementGenerator.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/VirtualSqlStatementGenerator.java
deleted file mode 100644
index 73f5e5e19..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/generator/VirtualSqlStatementGenerator.java
+++ /dev/null
@@ -1,169 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.generator;
-
-import com.alibaba.druid.util.StringUtils;
-import com.alibaba.fastjson.JSON;
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import com.webank.wedatasphere.dss.visualis.query.utils.ChartUtils;
-import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils;
-import edp.core.consts.Consts;
-import edp.core.utils.CollectionUtils;
-import edp.core.utils.SqlUtils;
-import edp.davinci.core.common.Constants;
-import edp.davinci.core.model.SqlEntity;
-import edp.davinci.core.model.SqlFilter;
-import edp.davinci.core.utils.SqlParseUtils;
-import edp.davinci.dao.SourceMapper;
-import edp.davinci.dto.viewDto.DistinctParam;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-import edp.davinci.model.Source;
-import edp.davinci.model.SqlVariable;
-import edp.davinci.model.User;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.beans.factory.annotation.Value;
-import org.springframework.stereotype.Component;
-import org.stringtemplate.v4.ST;
-import org.stringtemplate.v4.STGroup;
-import org.stringtemplate.v4.STGroupFile;
-
-import java.util.ArrayList;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Set;
-
-import static edp.core.consts.Consts.MINUS;
-
-@Slf4j
-@Component
-public class VirtualSqlStatementGenerator implements StatementGenerator {
-
-    @Autowired
-    private SqlUtils sqlUtils;
-
-    @Autowired
-    private SqlParseUtils sqlParseUtils;
-
-    @Autowired
-    private SourceMapper sourceMapper;
-
-    @Value("${sql_template_delimiter:$}")
-    private String sqlTempDelimiter;
-
-    @Override
-    public String generate(VirtualView virtualView, ViewExecuteParam executeParam, User user) {
-        ChartUtils.processViewExecuteParam(executeParam);
-        List variables = virtualView.getVariables();
-        // parse the sql
-        SqlEntity sqlEntity = sqlParseUtils.parseSql(virtualView.getSql(), variables, sqlTempDelimiter);
-        // column permissions (only the restricted columns are recorded)
-        Set excludeColumns = new HashSet<>();
-        String srcSql = sqlParseUtils.replaceParams(sqlEntity.getSql(), sqlEntity.getQuaryParams(), sqlEntity.getAuthParams(), sqlTempDelimiter, user);
-
-        List executeSqlList = sqlParseUtils.getSqls(srcSql, false);
-        List querySqlList = sqlParseUtils.getSqls(srcSql, true);
-
-        if (!CollectionUtils.isEmpty(querySqlList)) {
-
-            if (null != executeParam
-                    && null != executeParam.getCache()
-                    && executeParam.getCache()
-                    && executeParam.getExpired() > 0L) {
-
-                StringBuilder slatBuilder = new StringBuilder();
-                slatBuilder.append(executeParam.getPageNo());
-                slatBuilder.append(MINUS);
-                slatBuilder.append(executeParam.getLimit());
-                slatBuilder.append(MINUS);
-                slatBuilder.append(executeParam.getPageSize());
-                excludeColumns.forEach(slatBuilder::append);
-
-            }
-
-        }
-        buildQuerySql(querySqlList, executeParam);
-        return String.join(Consts.SEMICOLON, querySqlList);
-    }
-
-    @Override
-    public String generateDistinct(VirtualView virtualView, DistinctParam param, User user) {
-        List variables = virtualView.getVariables();
-        SqlEntity sqlEntity = sqlParseUtils.parseSql(virtualView.getSql(), variables, sqlTempDelimiter);
-        String srcSql = sqlParseUtils.replaceParams(sqlEntity.getSql(), sqlEntity.getQuaryParams(), sqlEntity.getAuthParams(), sqlTempDelimiter, user);
-
-        Source source = sourceMapper.getById(VisualisUtils.getHiveDataSourceId());
-
-        SqlUtils sqlUtils = this.sqlUtils.init(source);
-
-        List executeSqlList = sqlParseUtils.getSqls(srcSql, false);
-        List querySqlList = sqlParseUtils.getSqls(srcSql, true);
-        if (!CollectionUtils.isEmpty(querySqlList)) {
-            String cacheKey = null;
-            if (null != param) {
-                STGroup stg = new STGroupFile(Constants.SQL_TEMPLATE);
-                ST st = stg.getInstanceOf("queryDistinctSql");
-                st.add("columns", param.getColumns());
-                st.add("filters", convertFilters(param.getFilters(), source));
-                st.add("sql", querySqlList.get(querySqlList.size() - 1));
-                st.add("keywordPrefix", SqlUtils.getKeywordPrefix(source.getJdbcUrl(), source.getDbVersion()));
-                st.add("keywordSuffix", SqlUtils.getKeywordSuffix(source.getJdbcUrl(), source.getDbVersion()));
-
-                String sql = st.render();
-                querySqlList.set(querySqlList.size() - 1, sql);
-            }
-        }
-        return String.join(Consts.SEMICOLON, querySqlList);
-    }
-
-
-    public void buildQuerySql(List querySqlList, ViewExecuteParam executeParam) {
-        Source source = sourceMapper.getById(VisualisUtils.getHiveDataSourceId());
-        if (null != executeParam) {
-            // build parameters; the original ones are replaced by those passed in
-            STGroup stg = new STGroupFile(Constants.SQL_TEMPLATE);
-            ST st = stg.getInstanceOf("querySql");
-            st.add("nativeQuery", executeParam.isNativeQuery());
-            st.add("groups", executeParam.getGroups());
-
-            if (executeParam.isNativeQuery()) {
-                st.add("aggregators", executeParam.getAggregators());
-            } else {
-                st.add("aggregators", executeParam.getAggregators(source.getJdbcUrl(), source.getDbVersion()));
-            }
-            st.add("orders", executeParam.getOrders(source.getJdbcUrl(), source.getDbVersion()));
-            st.add("filters", convertFilters(executeParam.getFilters(), source));
-            st.add("keywordPrefix", sqlUtils.getKeywordPrefix(source.getJdbcUrl(), source.getDbVersion()));
-            st.add("keywordSuffix", sqlUtils.getKeywordSuffix(source.getJdbcUrl(), source.getDbVersion()));
-
-            for (int i = 0; i < querySqlList.size(); i++) {
-                st.add("sql", querySqlList.get(i));
-                querySqlList.set(i, st.render());
-            }
-
-        }
-    }
-
-    public List convertFilters(List filterStrs, Source source) {
-        List whereClauses = new ArrayList<>();
-        List filters = new ArrayList<>();
-        try {
-            if (null == filterStrs || filterStrs.isEmpty()) {
-                return null;
-            }
-
-            for (String str : filterStrs) {
-                SqlFilter obj = JSON.parseObject(str, SqlFilter.class);
-                if (!StringUtils.isEmpty(obj.getName())) {
-                    obj.setName(ViewExecuteParam.getField(obj.getName(), source.getJdbcUrl(), source.getDbVersion()));
-                }
-                filters.add(obj);
-            }
-            filters.forEach(filter -> whereClauses.add(SqlFilter.dealFilter(filter)));
-
-        }catch (Exception e){
-            log.error("convertFilters error . filterStrs = {}, source = {}, filters = {} , whereClauses = {} ",
-                    filterStrs, source, filters, whereClauses);
-            throw e;
-        }
-        return whereClauses;
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ContextSourceInitializer.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ContextSourceInitializer.java
deleted file mode 100644
index 461660033..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ContextSourceInitializer.java
+++ /dev/null
@@ -1,39 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.initializer;
-
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils;
-import org.apache.linkis.common.exception.ErrorException;
-import org.apache.linkis.cs.common.exception.CSErrorException;
-import edp.davinci.model.User;
-import org.apache.commons.lang.StringUtils;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.stereotype.Component;
-
-@Component
-public class ContextSourceInitializer implements SourceInitializer{
-
-    private static final Logger logger = LoggerFactory.getLogger(ContextSourceInitializer.class);
-
-    @Override
-    public SourceInitJob init(VirtualView virtualView, User user) throws ErrorException {
-        SourceInitJob sourceInitJob = null;
-        try {
-            QueryUtils.refreshFromContext(virtualView);
-        } catch (CSErrorException e) {
-            logger.error("Failed to refresh metadata:", e);
-        }
-
-        String tableName = virtualView.getSource().getDataSourceContent().get("tableName");
-        String dbName = virtualView.getSource().getDataSourceContent().get("dbName");
-        String fullName = StringUtils.isBlank(dbName) ? tableName : dbName + "." + tableName;
-        String selectFrom = "select * from " + fullName;
-        virtualView.setSql(selectFrom);
-        return sourceInitJob;
-    }
-
-    @Override
-    public String getType() {
-        return "context";
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ResultSetSourceInitializer.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ResultSetSourceInitializer.java
deleted file mode 100644
index c2977cc62..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/ResultSetSourceInitializer.java
+++ /dev/null
@@ -1,28 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.initializer;
-
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils;
-import com.webank.wedatasphere.dss.visualis.res.ResultHelper;
-import com.webank.wedatasphere.dss.visualis.ujes.UJESJob;
-import edp.davinci.model.User;
-import org.springframework.stereotype.Component;
-
-
-@Component
-public class ResultSetSourceInitializer implements SourceInitializer{
-
-    @Override
-    public SourceInitJob init(VirtualView virtualView, User user) {
-        String resultLocation = ResultHelper.getSchemaPath(virtualView.getSource().getDataSourceContent().get("resultLocation"));
-        String tempViewName = "tmp_res_" + (virtualView.getName().hashCode() & Integer.MAX_VALUE);
-        String selectFrom = "select * from " + tempViewName;
-        virtualView.setSql(selectFrom);
-        return new SourceInitJob(QueryUtils.getCreateTempViewScala(tempViewName, resultLocation), UJESJob.SCALA_TYPE());
-    }
-
-    @Override
-    public String getType() {
-        return "resultset";
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitJob.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitJob.java
deleted file mode 100644
index 19e887833..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitJob.java
+++ /dev/null
@@ -1,16 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.initializer;
-
-import lombok.Data;
-
-@Data
-public class SourceInitJob {
-
-    String scriptContent;
-    String scriptType;
-
-    public SourceInitJob(String scriptContent, String scriptType) {
-        this.scriptContent = scriptContent;
-        this.scriptType = scriptType;
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitializer.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitializer.java
deleted file mode 100644
index a245c7c4c..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/SourceInitializer.java
+++ /dev/null
@@ -1,11 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.initializer;
-
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import org.apache.linkis.common.exception.ErrorException;
-import edp.davinci.model.User;
-
-public interface SourceInitializer {
-
-    SourceInitJob init(VirtualView virtualView, User user) throws ErrorException;
-    String getType();
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/UrlSourceInitializer.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/UrlSourceInitializer.java
deleted file mode 100644
index 26c77c596..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/initializer/UrlSourceInitializer.java
+++ /dev/null
@@ -1,53 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.initializer;
-
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils;
-import com.webank.wedatasphere.dss.visualis.ujes.UJESJob;
-import edp.davinci.model.User;
-import org.apache.commons.lang.StringUtils;
-import org.apache.http.client.CookieStore;
-import org.apache.http.client.methods.CloseableHttpResponse;
-import org.apache.http.client.methods.HttpGet;
-import org.apache.http.impl.client.BasicCookieStore;
-import org.apache.http.impl.client.CloseableHttpClient;
-import org.apache.http.impl.client.HttpClientBuilder;
-import org.apache.http.util.EntityUtils;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.stereotype.Component;
-
-import java.io.IOException;
-
-@Component
-public class UrlSourceInitializer implements SourceInitializer {
-
-    private static final Logger logger = LoggerFactory.getLogger(UrlSourceInitializer.class);
-
-    @Override
-    public SourceInitJob init(VirtualView virtualView, User user) {
-        SourceInitJob sourceInitJob = null;
-        CookieStore cookieStore = new BasicCookieStore();
-        CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build();
-        HttpGet httpGet = new HttpGet(virtualView.getSource().getDataSourceContent().get("url"));
-        String dolphinResult = "";
-        try{
-            CloseableHttpResponse response = httpClient.execute(httpGet);
-            dolphinResult = EntityUtils.toString(response.getEntity(), "UTF-8");
-        }catch(IOException e){
-            logger.error("failed to download url data source, reason:" , e);
-        }
-        if(StringUtils.isNotBlank(dolphinResult)){
-            String tempViewName = "tmp_url_" + virtualView.getName();
-            sourceInitJob = new SourceInitJob(QueryUtils.getCreateTempViewScala(tempViewName, dolphinResult), UJESJob.SCALA_TYPE());
-            String selectFrom = "select * from " + tempViewName;
-            virtualView.setSql(selectFrom);
-        }
-        return sourceInitJob;
-    }
-
-    @Override
-    public String getType() {
-        return "url";
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualSource.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualSource.java
deleted file mode 100644
index 4bddd054b..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualSource.java
+++ /dev/null
@@ -1,16 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.model;
-
-import edp.davinci.model.Source;
-import lombok.Data;
-
-import java.util.Map;
-
-@Data
-public class VirtualSource extends Source {
-
-    String engineType;
-    String dataSourceType;
-    Map dataSourceContent;
-    String creator;
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualView.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualView.java
deleted file mode 100644
index 56e842243..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/model/VirtualView.java
+++ /dev/null
@@ -1,12 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.model;
-
-import edp.davinci.model.View;
-import lombok.Data;
-
-import java.util.Map;
-
-@Data
-public class VirtualView extends View {
-    VirtualSource source;
-    Map params;
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryService.java
deleted file mode 100644
index 52130ff15..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryService.java
+++ /dev/null
@@ -1,16 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.service;
-
-import edp.core.model.Paginate;
-import edp.davinci.dto.viewDto.DistinctParam;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-import edp.davinci.model.User;
-
-import java.util.List;
-import java.util.Map;
-
-public interface VirtualViewQueryService {
-
-    Paginate> getData(ViewExecuteParam executeParam, User user, boolean async) throws Exception;
-
-    List> getDistinctValue(DistinctParam param, User user) throws Exception;
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryServiceImpl.java
deleted file mode 100644
index d5f7bf1ff..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/service/VirtualViewQueryServiceImpl.java
+++ /dev/null
@@ -1,123 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.service;
-
-import com.google.common.collect.Maps;
-import com.google.common.collect.Sets;
-import com.webank.wedatasphere.dss.visualis.query.generator.VirtualSqlStatementGenerator;
-import com.webank.wedatasphere.dss.visualis.query.initializer.*;
-import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils;
-import com.webank.wedatasphere.dss.visualis.ujes.UJESJob;
-import edp.core.model.Paginate;
-import edp.core.utils.SqlUtils;
-import edp.davinci.dto.viewDto.DistinctParam;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-import edp.davinci.model.User;
-import org.springframework.beans.factory.InitializingBean;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Component;
-
-import java.util.List;
-import java.util.Map;
-
-@Component
-public class VirtualViewQueryServiceImpl implements VirtualViewQueryService, InitializingBean {
-
-    @Autowired
-    SqlUtils sqlUtils;
-
-    @Autowired
-    VirtualSqlStatementGenerator virtualSqlStatementGenerator;
-
-    @Autowired
-    ResultSetSourceInitializer resultSetSourceInitializer;
-
-    @Autowired
-    ContextSourceInitializer contextSourceInitializer;
-
-    @Autowired
-    UrlSourceInitializer urlSourceInitializer;
-
-    Map typeToSourceInitializer;
-
-    @Override
-    public void afterPropertiesSet() {
-        sqlUtils = sqlUtils.init(null);
-        typeToSourceInitializer = Maps.newHashMap();
-        typeToSourceInitializer.put(resultSetSourceInitializer.getType(), resultSetSourceInitializer);
-        typeToSourceInitializer.put(contextSourceInitializer.getType(), contextSourceInitializer);
-        typeToSourceInitializer.put(urlSourceInitializer.getType(), urlSourceInitializer);
-    }
-
-    public Paginate> getData(ViewExecuteParam executeParam, User user, boolean async) throws Exception {
-        SourceInitializer sourceInitializer = typeToSourceInitializer.get(executeParam.getView().getSource().getDataSourceType());
-        SourceInitJob sourceInitJob = sourceInitializer.init(executeParam.getView(), user);
-        String queryScripts = virtualSqlStatementGenerator.generate(executeParam.getView(), executeParam, user);
-        String contextId = executeParam.getView().getSource().getDataSourceContent().get("contextId");
-        String nodeName = executeParam.getView().getSource().getDataSourceContent().get("nodeName");
-        contextId = contextId == null ? "" : contextId;
-        String jobType = UJESJob.SQL_TYPE();
-        if (sourceInitJob != null) {
-            queryScripts = QueryUtils.getQueryTempViewScala(queryScripts, sourceInitJob.getScriptContent());
-            jobType = sourceInitJob.getScriptType();
-        }
-        String linkisJob = QueryUtils.getLinkisSparkJob(
-                user,
-                executeParam.getView().getName(),
-                queryScripts,
-                jobType,
-                executeParam.getView().getSource().getCreator(),
-                executeParam.getView().getSource().getEngineType(),
-                contextId,
-                nodeName,
-                executeParam.getCache(),
-                executeParam.getExpired(),
-                executeParam.getCache(),
-                executeParam.getExpired());
-        if (async) {
-            return sqlUtils.asyncQuery4Exec(
-                    linkisJob,
-                    executeParam.getPageNo(),
-                    executeParam.getPageSize(),
-                    executeParam.getTotalCount(),
-                    executeParam.getLimit(),
-                    Sets.newHashSet());
-        } else {
-            return sqlUtils.syncQuery4Paginate(
-                    linkisJob,
-                    executeParam.getPageNo(),
-                    executeParam.getPageSize(),
-                    executeParam.getTotalCount(),
-                    executeParam.getLimit(),
-                    Sets.newHashSet()
-            );
-        }
-    }
-
-    @Override
-    public List> getDistinctValue(DistinctParam param, User user) throws Exception {
-        SourceInitializer sourceInitializer = typeToSourceInitializer.get(param.getView().getSource().getDataSourceType());
-        SourceInitJob sourceInitJob = sourceInitializer.init(param.getView(), user);
-        String queryScripts = virtualSqlStatementGenerator.generateDistinct(param.getView(), param, user);
-        String contextId = param.getView().getSource().getDataSourceContent().get("contextId");
-        contextId = contextId == null ? "" : contextId;
-        String nodeName = param.getView().getSource().getDataSourceContent().get("nodeName");
-        String jobType = UJESJob.SQL_TYPE();
-        if (sourceInitJob != null) {
-            queryScripts = QueryUtils.getQueryTempViewScala(queryScripts, sourceInitJob.getScriptContent());
-            jobType = sourceInitJob.getScriptType();
-        }
-        String linkisJob = QueryUtils.getLinkisSparkJob(
-                user,
-                param.getView().getName(),
-                queryScripts, jobType,
-                param.getView().getSource().getCreator(),
-                param.getView().getSource().getEngineType(),
-                contextId,
-                nodeName,
-                param.getCache(),
-                param.getExpired(),
-                param.getCache(),
-                param.getExpired());
-        return sqlUtils.query4List(linkisJob, -1);
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/ChartUtils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/ChartUtils.java
deleted file mode 100644
index c4fc7d794..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/ChartUtils.java
+++ /dev/null
@@ -1,28 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.query.utils;
-
-import com.google.common.collect.Iterables;
-import edp.core.utils.CollectionUtils;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-
-import java.util.Iterator;
-
-public class ChartUtils {
-
-    public static String RELATION_GRAPH = "relation_graph";
-
-    public static void processViewExecuteParam(ViewExecuteParam viewExecuteParam){
-        if(RELATION_GRAPH.equalsIgnoreCase(viewExecuteParam.getChartType())){
-            String firstGroup = Iterables.getFirst(viewExecuteParam.getGroups(), null);
-            if(firstGroup != 
null && !CollectionUtils.isEmpty(viewExecuteParam.getFilters())){ - Iterator iterator = viewExecuteParam.getFilters().iterator(); - while (iterator.hasNext()){ - if(iterator.next().contains(firstGroup)){ - iterator.remove(); - break; - } - } - } - } - } - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/EnvLimitUtils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/EnvLimitUtils.java deleted file mode 100644 index 165c157d2..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/EnvLimitUtils.java +++ /dev/null @@ -1,24 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.query.utils; - -import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; - -public class EnvLimitUtils { - - public static final String BDP_PROD = "BDP_PROD"; - public static final String BDAP_PROD = "BDAP_PROD"; - public static final String ERROR_MESSAGE = "BDP生产环境不允许进行此操作!"; - - public static boolean notPermitted() { - if (BDP_PROD.equals(CommonConfig.DEPLOY_ENV().getValue())) { - return true; - } - return false; - } - - public static boolean isProdEnv() { - if (BDAP_PROD.equals(CommonConfig.DEPLOY_ENV().getValue())) { - return true; - } - return false; - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/JdbcAsyncUtils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/JdbcAsyncUtils.java deleted file mode 100644 index c3bbe47f7..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/JdbcAsyncUtils.java +++ /dev/null @@ -1,53 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.query.utils; - -import com.google.common.cache.Cache; -import com.google.common.cache.CacheBuilder; -import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; -import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus; -import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; 
-import org.apache.linkis.rpc.Sender; -import org.apache.linkis.scheduler.queue.SchedulerEventState; -import edp.core.model.PaginateWithQueryColumns; - -import java.util.UUID; -import java.util.concurrent.TimeUnit; - -public class JdbcAsyncUtils { - - static String EXEC_ID_PREFIX = "JDBC_"; - - static Cache jdbcCache = CacheBuilder.newBuilder() - .expireAfterWrite((Long) CommonConfig.JDBC_CACHE_FLUSH_WRITE().getValue(), TimeUnit.SECONDS) - .build(); - - public static void putResult(String execId, PaginateWithQueryColumns resultSet) { - jdbcCache.put(execId, resultSet); - } - - public static PaginateWithQueryColumns getResult(String execId) { - String instance = VisualisUtils.getInstanceByHAExecId(execId); - if (instance.equals(Sender.getThisInstance())) { - PaginateWithQueryColumns paginateWithQueryColumns = jdbcCache.getIfPresent(execId); - jdbcCache.invalidate(execId); - return paginateWithQueryColumns; - } else { - return VisualisUtils.getJDBCResult(instance, execId); - } - } - - public static String generateExecId() { - return VisualisUtils.getHAExecId(EXEC_ID_PREFIX + UUID.randomUUID().toString()); - } - - public static boolean isJdbcExecId(String execId) { - return execId.startsWith(EXEC_ID_PREFIX); - } - - public static PaginateWithExecStatus getJdbcProgress(String execId) { - PaginateWithExecStatus paginateWithExecStatus = new PaginateWithExecStatus(); - paginateWithExecStatus.setExecId(execId); - paginateWithExecStatus.setProgress(1L); - paginateWithExecStatus.setStatus(SchedulerEventState.Succeed().toString()); - return paginateWithExecStatus; - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/QueryUtils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/QueryUtils.java deleted file mode 100644 index cf7467d5c..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/query/utils/QueryUtils.java +++ /dev/null @@ -1,160 +0,0 @@ -package 
com.webank.wedatasphere.dss.visualis.query.utils; - -import com.google.common.collect.Iterables; -import com.google.common.collect.Lists; -import com.google.common.collect.Maps; -import com.webank.wedatasphere.dss.visualis.query.model.VirtualSource; -import com.webank.wedatasphere.dss.visualis.query.model.VirtualView; -import com.webank.wedatasphere.dss.visualis.res.ResultHelper; -import com.webank.wedatasphere.dss.visualis.ujes.UJESJob; -import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; -import org.apache.linkis.adapt.LinkisUtils; -import org.apache.linkis.common.exception.ErrorException; -import org.apache.linkis.cs.client.service.CSTableService; -import org.apache.linkis.cs.client.utils.SerializeHelper; -import org.apache.linkis.cs.common.entity.metadata.CSTable; -import org.apache.linkis.cs.common.entity.metadata.Column; -import org.apache.linkis.cs.common.entity.source.ContextKeyValue; -import org.apache.linkis.cs.common.utils.CSCommonUtils; -import org.apache.linkis.server.BDPJettyServerHelper; -import edp.davinci.common.model.VisualViewModel; -import edp.davinci.model.User; -import org.apache.commons.lang.StringUtils; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.Base64; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - -public class QueryUtils { - - private static final Logger logger = LoggerFactory.getLogger(QueryUtils.class); - - - final static Base64.Decoder decoder = Base64.getDecoder(); - final static Base64.Encoder encoder = Base64.getEncoder(); - - public static String getLinkisSparkJob(User user, String scriptName, String script, String jobType, String creator, String engine, String contextId, String nodeName, Boolean cache, Long cacheExpireAfter, Boolean readFromCache, Long readCacheBefore) { - HashMap sourceMap = Maps.newHashMap(); - sourceMap.put("fileName", scriptName); - UJESJob ujesJob = new UJESJob(script, user.getName(), jobType, sourceMap, creator, engine, nodeName, 
contextId, cache, cacheExpireAfter, readFromCache, readCacheBefore); - return BDPJettyServerHelper.gson().toJson(ujesJob); - } - - public static List getFromContext(String encodedContextId, String nodeName) throws ErrorException { - List virtualViews = Lists.newArrayList(); - String contextId = decodeContextId(encodedContextId); - CSTableService csTableService = CSTableService.getInstance(); - List csTables = csTableService.searchUpstreamTableKeyValue(contextId, nodeName); - for (ContextKeyValue contextKeyValue : csTables) { - VirtualView virtualView = getVirtualViewByContextKeyValue(contextId, nodeName, contextKeyValue); - virtualViews.add(virtualView); - } - return virtualViews; - } - - public static VirtualView getExactFromContext(String encodedContextId, String nodeName) throws ErrorException { - VirtualView virtualView = null; - String contextId = decodeContextId(encodedContextId); - CSTableService csTableService = CSTableService.getInstance(); - List csTables = csTableService.searchUpstreamTableKeyValue(contextId, nodeName); - for (ContextKeyValue contextKeyValue : csTables) { - if (nodeName.equals(StringUtils.substringBetween(contextKeyValue.getContextKey().getKey(), CSCommonUtils.NODE_PREFIX, "."))) { - virtualView = getVirtualViewByContextKeyValue(contextId, nodeName, contextKeyValue); - return virtualView; - } - } - return virtualView; - } - - public static VirtualView refreshFromContext(VirtualView virtualView) throws ErrorException { - String contextId = virtualView.getSource().getDataSourceContent().get("contextId"); - String contextKey = virtualView.getSource().getDataSourceContent().get("contextKey"); - CSTableService csTableService = CSTableService.getInstance(); - CSTable csTable = csTableService.getCSTable(contextId, contextKey); - if (csTable == null) { - logger.info("use nodeName to refresh context, because no metadata found by contextId[" + contextId + "] and contextKey[" + contextKey + "]"); - String nodeName = 
virtualView.getSource().getDataSourceContent().get("nodeName"); - ContextKeyValue contextKeyValue = Iterables.getFirst(csTableService.searchUpstreamTableKeyValue(contextId, nodeName), null); - if (contextKeyValue == null) { - logger.warn("no metadata found by contextId[" + contextId + "] and nodeName[" + nodeName + "]"); - return virtualView; - } - logger.info("found metadata by nodeName[" + nodeName + "] , key[" + contextKeyValue.getContextKey().getKey() + "]"); - String nodeNameFromKey = StringUtils.substringBetween(contextKeyValue.getContextKey().getKey(), CSCommonUtils.NODE_PREFIX, "."); - if (!nodeName.equals(nodeNameFromKey)) { - logger.warn("metadata node does not match! expected:[" + nodeName + "], actual:[" + nodeNameFromKey + "]"); - return virtualView; - } - return getVirtualViewByContextKeyValue(contextId, nodeName, contextKeyValue); - } - virtualView.getSource().getDataSourceContent().put("tableName", csTable.getName()); - if (csTable.getDb() != null) { - virtualView.getSource().getDataSourceContent().put("dbName", csTable.getDb().getName()); - } - if (StringUtils.isNotBlank(csTable.getLocation())) { - virtualView.getSource().getDataSourceContent().put("location", csTable.getLocation()); - } - return virtualView; - } - - public static VirtualView getVirtualViewByContextKeyValue(String contextId, String nodeName, ContextKeyValue contextKeyValue) throws ErrorException { - CSTable csTable = (CSTable) (contextKeyValue.getContextValue().getValue()); - VirtualView virtualView = new VirtualView(); - VirtualSource virtualSource = new VirtualSource(); - virtualSource.setCreator(VisualisUtils.VG_CREATOR().getValue()); - virtualSource.setEngineType(VisualisUtils.SPARK().getValue()); - virtualSource.setDataSourceType("context"); - Map dataSourceContent = Maps.newHashMap(); - dataSourceContent.put("contextId", contextId); - dataSourceContent.put("nodeName", nodeName); - dataSourceContent.put("contextKey", 
SerializeHelper.serializeContextKey(contextKeyValue.getContextKey())); - dataSourceContent.put("tableName", csTable.getName()); - if (csTable.getDb() != null) { - dataSourceContent.put("dbName", csTable.getDb().getName()); - } - if (StringUtils.isNotBlank(csTable.getLocation())) { - dataSourceContent.put("location", csTable.getLocation()); - } - virtualSource.setDataSourceContent(dataSourceContent); - - virtualView.setSource(virtualSource); - virtualView.setName(csTable.getName()); - Map model = Maps.newLinkedHashMap(); - for (Column column : csTable.getColumns()) { - String sqlType = column.getType().toUpperCase(); - String visualType = ResultHelper.toVisualType(sqlType); - String modelType = ResultHelper.NUMBER_TYPE().equals(visualType) ? "value" : "category"; - VisualViewModel visualViewModel = new VisualViewModel(); - visualViewModel.setSqlType(sqlType); - visualViewModel.setVisualType(visualType); - visualViewModel.setModelType(modelType); - model.put(column.getName(), visualViewModel); - } - virtualView.setModel(LinkisUtils.gson().toJson(model)); - virtualView.setParams(Maps.newHashMap()); - return virtualView; - } - - public static String encodeContextId(String contextId) { - return encoder.encodeToString(contextId.getBytes()); - } - - public static String decodeContextId(String encodedContextId) { - return new String(decoder.decode(encodedContextId)); - } - - public static String getCreateTempViewScala(String tempViewName, String resultLocation) { - return "org.apache.spark.sql.execution.datasources.csv.DolphinToSpark.createTempView(spark,\"" - + tempViewName + "\",\"" + resultLocation + "\", true);"; - } - - public static String getQueryTempViewScala(String sql, String createTempViewScala) { - return "val sql = \"\"\" " + sql + "\"\"\"\n" - + createTempViewScala - + "show(spark.sql(sql))"; - } - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestful.java 
b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestful.java deleted file mode 100644 index f15c89be2..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestful.java +++ /dev/null @@ -1,69 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.restful; - -import com.webank.wedatasphere.dss.visualis.service.ParamsService; -import edp.core.annotation.MethodLog; -import edp.davinci.common.controller.BaseController; -import edp.davinci.core.common.Constants; -import edp.davinci.core.common.ResultMap; -import edp.davinci.model.*; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.collections.CollectionUtils; -import org.apache.linkis.server.security.SecurityFilter; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.http.MediaType; -import org.springframework.http.ResponseEntity; -import org.springframework.web.bind.annotation.*; - -import javax.servlet.http.HttpServletRequest; -import java.util.List; -import java.util.Map; - -@Slf4j -@RestController -@RequestMapping(path = Constants.RESTFUL_BASE_PATH + "params", produces = MediaType.APPLICATION_JSON_VALUE) -@ComponentScan(basePackages = {"edp", "com.webank.wedatasphere.dss"}) -public class ParamsRestful extends BaseController { - - @Autowired - private ParamsService paramsService; - - @MethodLog - @RequestMapping(path = "create", method = RequestMethod.POST) - public ResponseEntity createParams(HttpServletRequest req, @RequestBody Params params) { - - if (CollectionUtils.isEmpty(params.getParamDetails())) { - ResultMap resultMap = new ResultMap().fail().message("create param error."); - return ResponseEntity.ok(resultMap); - } - try { - paramsService.insertParams(params); - } catch (Exception e) { - log.error("create param fail, because: ", e); - return ResponseEntity.ok(new ResultMap().fail().message(e.getMessage())); - } - return 
ResponseEntity.ok(new ResultMap().success().payload(params)); - } - - - @MethodLog - @RequestMapping(path = "info", method = RequestMethod.GET) - public ResponseEntity getGraphInfo(HttpServletRequest req, @RequestParam String projectName) { - - String userName = SecurityFilter.getLoginUsername(req); - List> graphInfo; - - try { - graphInfo = paramsService.getGraphInfo(projectName, userName); - } catch (Exception e) { - log.error("access info error, because: ", e); - return ResponseEntity.ok(new ResultMap().fail().message(e.getMessage())); - } - - if (null == graphInfo) { - return ResponseEntity.ok(new ResultMap().fail().message("Project name is incorrect.")); - } - - return ResponseEntity.ok(new ResultMap().success().payload(graphInfo)); - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestfulApi.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestfulApi.java new file mode 100644 index 000000000..73dc8dc82 --- /dev/null +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ParamsRestfulApi.java @@ -0,0 +1,144 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ +package com.webank.wedatasphere.dss.visualis.restful; +import com.alibaba.fastjson.JSONObject; +import com.google.common.collect.Iterables; +import com.google.common.collect.Lists; +import com.google.common.collect.Maps; +import com.webank.wedatasphere.linkis.server.Message; +import com.webank.wedatasphere.linkis.server.security.SecurityFilter; +import edp.core.common.job.ScheduleService; +import edp.davinci.core.common.Constants; +import edp.davinci.dao.*; +import edp.davinci.model.*; +import edp.davinci.service.ShareService; +import lombok.extern.slf4j.Slf4j; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.lang.StringUtils; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.context.annotation.ComponentScan; +import org.springframework.stereotype.Component; + +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.*; +import javax.ws.rs.core.Context; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; +import java.util.List; +import java.util.Map; +import java.util.UUID; +/** + * Created by shanhuang on 2019/1/23. 
+ */ +@Slf4j +@Path(Constants.RESTFUL_BASE_PATH + "params") +@Component +@Produces(MediaType.APPLICATION_JSON) +@Consumes(MediaType.APPLICATION_JSON) +@ComponentScan(basePackages = {"edp","com.webank.wedatasphere.dss"}) +public class ParamsRestfulApi { + + @Autowired + private ParamsMapper paramsMapper; + + @Autowired + private ProjectMapper projectMapper; + + @Autowired + private UserMapper userMapper; + + @Autowired + private DashboardPortalMapper dashboardPortalMapper; + + @Autowired + private DashboardMapper dashboardMapper; + + @Autowired + private MemDashboardWidgetMapper memDashboardWidgetMapper; + + @Autowired + private WidgetMapper widgetMapper; + + @Autowired + private ScheduleService scheduleService; + + + @POST + @Path("create") + public Response createParams(@Context HttpServletRequest req, Params params) { + Message message = null; + + if(CollectionUtils.isEmpty(params.getParamDetails())){ + message = Message.error("Params body cannot be empty"); + return Message.messageToResponse(message); + } + + params.setUuid(UUID.randomUUID().toString()); + params.setParams(JSONObject.toJSONString(params.getParamDetails())); + + paramsMapper.insert(params); + message = Message.ok().data("params", params); + return Message.messageToResponse(message); + } + + @GET + @Path("info") + public Response getGraphInfo(@Context HttpServletRequest req, @QueryParam("projectName") String projectName) { + Message message = null; + + String userName = SecurityFilter.getLoginUsername(req); + User user = userMapper.selectByUsername(userName); + + Project project = Iterables.getFirst(projectMapper.getProjectByNameWithUserId(projectName, user.getId()), null); + if(project == null) { + message = Message.error("Project does not exist"); + return Message.messageToResponse(message); + } + + List> dashboardsInfo = Lists.newArrayList(); + List dashboardPortals = dashboardPortalMapper.getByProject(project.getId()); + for(DashboardPortal portal : dashboardPortals){ + List dashboardList = 
dashboardMapper.getByPortalId(portal.getId()); + for(Dashboard dashboard : dashboardList){ + Map dashboardInfo = Maps.newHashMap(); + dashboardInfo.put("dashboardId", dashboard.getId()); + dashboardInfo.put("name", dashboard.getName()); + dashboardInfo.put("url", scheduleService.getContentUrl(user.getId(), "dashboard", dashboard.getId())); + + List> widgetsInfo = Lists.newArrayList(); + List memDashboardWidgets = memDashboardWidgetMapper.getByDashboardId(dashboard.getId()); + for(MemDashboardWidget memDashboardWidget : memDashboardWidgets){ + Map widgetInfo = Maps.newHashMap(); + Widget widget = widgetMapper.getById(memDashboardWidget.getWidgetId()); + widgetInfo.put("widgetId", widget.getId()); + widgetInfo.put("name", widget.getName()); + widgetInfo.put("viewId", widget.getViewId()); + widgetInfo.put("url", scheduleService.getContentUrl(user.getId(), "widget", widget.getId())); + widgetsInfo.add(widgetInfo); + } + dashboardInfo.put("widgets", widgetsInfo); + dashboardsInfo.add(dashboardInfo); + } + } + + + message = Message.ok().data("dashboards", dashboardsInfo); + return Message.messageToResponse(message); + } + + +} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ProjectRestful.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ProjectRestful.java deleted file mode 100644 index d6fde338e..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ProjectRestful.java +++ /dev/null @@ -1,132 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.restful; - -import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; -import com.webank.wedatasphere.dss.visualis.service.DssProjectService; -import edp.core.annotation.MethodLog; -import edp.core.utils.CollectionUtils; -import edp.davinci.core.common.Constants; -import edp.davinci.core.common.ResultMap; -import edp.davinci.dao.ProjectMapper; -import edp.davinci.dao.UserMapper; -import edp.davinci.model.Project; -import 
edp.davinci.model.User; -import lombok.extern.slf4j.Slf4j; -import org.apache.linkis.server.Message; -import org.apache.linkis.server.security.SecurityFilter; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.http.MediaType; -import org.springframework.http.ResponseEntity; -import org.springframework.web.bind.annotation.*; - -import javax.annotation.Resource; -import javax.servlet.http.HttpServletRequest; -import java.util.Date; -import java.util.List; -import java.util.Map; - -@Slf4j -@RestController -@RequestMapping(path = Constants.RESTFUL_BASE_PATH + "project", produces = MediaType.APPLICATION_JSON_VALUE) -@ComponentScan(basePackages = {"edp", "com.webank.wedatasphere.dss"}) -public class ProjectRestful { - - @Resource(name = "dssProjectService") - DssProjectService dssProjectService; - - @Autowired - UserMapper userMapper; - - @Autowired - ProjectMapper projectMapper; - - /** - * 该接口由Scriptis结果集可视化分析功能请求 - * */ - @MethodLog - @RequestMapping(path = "default", method = RequestMethod.GET) - public Message getDefault(HttpServletRequest req) { - String userName = SecurityFilter.getLoginUsername(req); - User user = userMapper.selectByUsername(userName); - List defaultProjects = projectMapper.getProjectByNameWithUserId(CommonConfig.DEFAULT_PROJECT_NAME().getValue(), user.getId()); - Project project = null; - if (CollectionUtils.isEmpty(defaultProjects)) { - project = new Project(); - project.setName(CommonConfig.DEFAULT_PROJECT_NAME().getValue()); - project.setCreateTime(new Date()); - project.setCreateUserId(user.getId()); - project.setDescription(""); - project.setInitialOrgId(null); - project.setIsTransfer(false); - project.setPic(null); - project.setStarNum(0); - project.setVisibility(true); - project.setOrgId(null); - project.setUserId(user.getId()); - projectMapper.insert(project); - } else { - project = defaultProjects.get(0); - } - Message message = 
Message.ok().data("project", project); - return message; - } - - - @MethodLog - @RequestMapping(path = "export", method = RequestMethod.POST) - public ResponseEntity exportProject(HttpServletRequest req, @RequestBody Map params) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = dssProjectService.exportProject(params, userName); - } catch (Exception e) { - log.error("export project error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "import", method = RequestMethod.POST) - public ResponseEntity importProject(HttpServletRequest req, @RequestBody Map params) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = dssProjectService.importProject(params, userName); - } catch (Exception e) { - log.error("import project error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "read", method = RequestMethod.GET) - public ResponseEntity read(HttpServletRequest req, @RequestParam(value = "fileName", required = false) String fileName, - @RequestParam(value = "projectId", required = false) Long projectId) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = dssProjectService.readProject(fileName, projectId, userName); - } catch (Exception e) { - log.error("read project error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "copy", method = RequestMethod.POST) - public ResponseEntity copy(HttpServletRequest req, @RequestBody Map params) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = 
dssProjectService.copyProject(params, userName); - } catch (Exception e) { - log.error("copy project error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestful.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestful.java deleted file mode 100644 index aa3c5249d..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestful.java +++ /dev/null @@ -1,100 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.restful; - -import com.webank.wedatasphere.dss.visualis.service.DssViewService; -import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo; -import edp.core.annotation.MethodLog; -import edp.davinci.common.controller.BaseController; -import edp.davinci.core.common.Constants; -import edp.davinci.core.common.ResultMap; -import lombok.extern.slf4j.Slf4j; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.http.MediaType; -import org.springframework.http.ResponseEntity; -import org.springframework.web.bind.annotation.PathVariable; -import org.springframework.web.bind.annotation.RequestMapping; -import org.springframework.web.bind.annotation.RequestMethod; -import org.springframework.web.bind.annotation.RestController; - -import javax.annotation.Resource; -import javax.servlet.http.HttpServletRequest; -import java.util.List; - -@Slf4j -@RestController -@RequestMapping(path = Constants.RESTFUL_BASE_PATH + "view", produces = MediaType.APPLICATION_JSON_VALUE) -@ComponentScan(basePackages = {"edp", "com.webank.wedatasphere.dss"}) -public class ViewRestful extends BaseController { - - - @Resource(name = "dssViewService") - private DssViewService dssViewService; - - @MethodLog - @RequestMapping(path = "enginetypes", method = RequestMethod.GET) - public ResponseEntity 
getAvailableEngineTypes(HttpServletRequest req, Long id) { - List engineTypes; - try { - engineTypes = dssViewService.getAvailableEngineTypes(req, id); - } catch (Exception e) { - log.error("read project error, because: " , e); - return ResponseEntity.ok(new ResultMap().fail().message(e.getMessage())); - } - return ResponseEntity.ok(new ResultMap().success().payload(engineTypes)); - } - - @MethodLog - @RequestMapping(method = RequestMethod.POST) - public ResponseEntity createView(HttpServletRequest req, DWCResultInfo dwcResultInfo) { - ResultMap resultMap = null; - try { - resultMap = dssViewService.createView(req, dwcResultInfo); - } catch (Exception e) { - log.error("create view error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - - @MethodLog - @RequestMapping(path = "{id}/getdata", method = RequestMethod.GET) - public ResponseEntity getViewData(HttpServletRequest req, @PathVariable("id") Long id) { - ResultMap resultMap = null; - try { - resultMap = dssViewService.getViewData(req, id); - } catch (Exception e) { - log.error("get view data error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - - @MethodLog - @RequestMapping(path = "{id}/async/submit", method = RequestMethod.GET) - public ResponseEntity asyncSubmitSql(HttpServletRequest req, @PathVariable("id") Long id) { - ResultMap resultMap = null; - try { - resultMap = dssViewService.submitQuery(req, id); - } catch (Exception e) { - log.error("submit view error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - - } - - - @MethodLog - @RequestMapping(path = "{id}/type/source", method = RequestMethod.GET) - public ResponseEntity isHiveDataSource(HttpServletRequest req, @PathVariable("id") Long id) { - ResultMap resultMap = null; - try { - resultMap = 
dssViewService.isHiveDataSource(req, id); - } catch (Exception e) { - log.error("get view source error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestfulApi.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestfulApi.java new file mode 100644 index 000000000..337a8bf74 --- /dev/null +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/ViewRestfulApi.java @@ -0,0 +1,151 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ +package com.webank.wedatasphere.dss.visualis.restful; + +import com.webank.wedatasphere.dss.visualis.utils.HttpUtils; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import com.webank.wedatasphere.linkis.server.Message; +import com.webank.wedatasphere.linkis.server.security.SecurityFilter; +import com.webank.wedatasphere.dss.visualis.entrance.spark.SqlCodeParse; +import com.webank.wedatasphere.dss.visualis.exception.VGErrorException; +import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo; +import com.webank.wedatasphere.dss.visualis.res.ResultHelper; +import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; +import edp.davinci.core.common.Constants; +import edp.davinci.dao.ProjectMapper; +import edp.davinci.dao.UserMapper; +import edp.davinci.dao.ViewMapper; +import edp.davinci.model.Project; +import edp.davinci.model.Source; +import edp.davinci.model.User; +import edp.davinci.model.View; +import edp.davinci.service.ProjectService; +import edp.davinci.service.SourceService; +import edp.davinci.service.ViewService; +import lombok.extern.slf4j.Slf4j; +import org.apache.commons.lang.StringUtils; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.context.annotation.ComponentScan; +import org.springframework.stereotype.Component; +import org.springframework.transaction.annotation.Transactional; + +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.*; +import javax.ws.rs.core.Context; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; +import java.util.List; + +/** + * Created by johnnwang on 2019/1/21. 
+ */ +@Slf4j +@Path(Constants.RESTFUL_BASE_PATH + "view") +@Component +@Produces(MediaType.APPLICATION_JSON) +@Consumes(MediaType.APPLICATION_JSON) +@ComponentScan(basePackages = {"edp","com.webank.wedatasphere.dss"}) +public class ViewRestfulApi { + + @Autowired + private ProjectService projectService; + + @Autowired + private ViewMapper viewMapper; + + @Autowired + private ProjectMapper projectMapper; + + @Autowired + private UserMapper userMapper; + + @Autowired + private SourceService sourceService; + + @Autowired + private ViewService viewService; + + @POST + public Response createView(@Context HttpServletRequest req, DWCResultInfo dwcResultInfo) { + Message message = null; + try { + String userName = SecurityFilter.getLoginUsername(req); + User user = userMapper.selectByUsername(userName); + Project project = projectMapper.getProejctsByUser(user.getId()).get(0); + + + if (project == null) { + message = Message.error("用户没有默认的项目,请联系管理员"); + return Message.messageToResponse(message); + + } + if (dwcResultInfo == null) { + message = Message.error("结果为空,无法做可视化分析"); + return Message.messageToResponse(message); + } + if(StringUtils.isEmpty(dwcResultInfo.getExecutionCode())){ + message = Message.error("脚本为空,无法做可视化分析"); + return Message.messageToResponse(message); + } + String[] sqlList = SqlCodeParse.parse(dwcResultInfo.getExecutionCode()); + int index = dwcResultInfo.getResultNumber(); + String code=""; + if(index < sqlList.length){ + code = sqlList[index]; + } + View view = new View(); + view.setProjectId(project.getId()); + view.setName(VisualisUtils.createTmpViewName(user.getName())); + + List sources= sourceService.getSources(project.getId(), user, HttpUtils.getUserTicketId(req)); + for(Source source : sources){ + if(VisualisUtils.isHiveDataSource(source)){ + view.setSourceId(source.getId()); + } + } + + view.setSql(code); + view.setModel(ResultHelper.toModelItem(dwcResultInfo.getResultPath())); + view.setConfig("{\"" + VisualisUtils.DWC_RESULT_INFO().getValue() 
+ "\":" + BDPJettyServerHelper.gson().toJson(dwcResultInfo) + "}"); + try { + view = createView(view, user); + message = Message.ok(); + message.data("id", view.getId()); + message.data("projectId",view.getProjectId()); + } catch (VGErrorException e) { + log.error("可视化分析失败:", e); + message = Message.error("可视化分析失败:" + e.getMessage()); + } + return Message.messageToResponse(message); + } catch (Throwable e) { + log.error("可视化分析失败:", e); + message = Message.error("可视化分析失败:" + e.getMessage()); + return Message.messageToResponse(message); + } + } + + + + @Transactional + public View createView(View view, User user) throws VGErrorException { + int id = viewMapper.insert(view); + if (id < 0) { + throw new VGErrorException(70002, "将view 插入数据库失败"); + } + return view; + } +} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetRestful.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetRestful.java deleted file mode 100644 index f24468d6a..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetRestful.java +++ /dev/null @@ -1,119 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.restful; - -import com.webank.wedatasphere.dss.visualis.service.DssWidgetService; -import edp.core.annotation.MethodLog; -import edp.davinci.common.controller.BaseController; -import edp.davinci.core.common.Constants; -import edp.davinci.core.common.ResultMap; -import lombok.extern.slf4j.Slf4j; -import org.apache.linkis.cs.common.utils.CSCommonUtils; -import org.apache.linkis.server.security.SecurityFilter; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.http.MediaType; -import org.springframework.http.ResponseEntity; -import org.springframework.web.bind.annotation.*; - -import javax.servlet.http.HttpServletRequest; -import java.util.Map; - - 
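The `createView` endpoint above maps a Scriptis result set back to the SQL statement that produced it: the submitted script is split into individual statements and `resultNumber` is used as an index into that array, with an out-of-range index falling back to an empty string. A minimal sketch of that lookup, assuming `SqlCodeParse.parse` splits on semicolons (the real parser may also handle comments and quoting; `ResultSqlLookup` and its methods are illustrative names, not part of Visualis):

```java
import java.util.Arrays;

public class ResultSqlLookup {

    // Hypothetical stand-in for SqlCodeParse.parse: split the script into
    // trimmed, non-empty statements.
    static String[] parse(String executionCode) {
        return Arrays.stream(executionCode.split(";"))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .toArray(String[]::new);
    }

    // Mirrors the guard in createView: resultNumber indexes into the parsed
    // statements, and an out-of-range index yields an empty code string.
    static String sqlForResult(String executionCode, int resultNumber) {
        String[] sqlList = parse(executionCode);
        return resultNumber < sqlList.length ? sqlList[resultNumber] : "";
    }

    public static void main(String[] args) {
        String script = "use demo_db; select * from t1; select count(1) from t2";
        // The second result set came from the second statement.
        System.out.println(sqlForResult(script, 1));
    }
}
```

Note that the endpoint also calls its own `@Transactional createView(view, user)` method directly; with Spring's proxy-based transactions, a self-invocation like this bypasses the proxy, so the transactional semantics may not apply as written.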
-@Slf4j -@RestController -@RequestMapping(path = Constants.RESTFUL_BASE_PATH + "widget", produces = MediaType.APPLICATION_JSON_VALUE) -@ComponentScan(basePackages = {"edp", "com.webank.wedatasphere.dss"}) -public class WidgetRestful extends BaseController { - - private static Logger logger = LoggerFactory.getLogger(WidgetRestful.class); - - @Autowired - DssWidgetService dssWidgetService; - - /** - * DSS工作流拖拽创建一个Widget的步骤: - * 1. 创建widget /api/rest_j/v1/visualis/widget/smartcreate - * 2. 设置该widget的CSID /api/rest_j/v1/visualis/widget/setcontext - * */ - - @MethodLog - @RequestMapping(path = "rename", method = RequestMethod.POST) - public ResponseEntity rename(HttpServletRequest req, @RequestBody Map params) { - - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - - try { - resultMap = dssWidgetService.rename(params); - } catch (Exception e) { - log.error("rename widget error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "smartcreate", method = RequestMethod.POST) - public ResponseEntity smartCreateFromSql(HttpServletRequest req, @RequestBody Map params) { - - ResultMap resultMap = null; - String userName = SecurityFilter.getLoginUsername(req); - - try { - resultMap = dssWidgetService.smartCreateFromSql(userName, params); - } catch (Exception e) { - log.error("rename widget error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "setcontext", method = RequestMethod.POST) - public ResponseEntity setcontext(HttpServletRequest req, @RequestBody Map params) { - - Long widgetId = ((Integer) params.getOrDefault("id", -1)).longValue(); - String contextId = ((String) params.getOrDefault(CSCommonUtils.CONTEXT_ID_STR, "")); - - ResultMap resultMap = null; - - try { - resultMap = 
dssWidgetService.updateContextId(widgetId, contextId); - } catch (Exception e) { - log.error("set widget context error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - @MethodLog - @RequestMapping(path = "{id}/getdata", method = RequestMethod.GET) - public ResponseEntity getWidgetData(HttpServletRequest req, @PathVariable("id") Long id) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = dssWidgetService.getWidgetData(userName, id); - } catch (Exception e) { - log.error("get widget data error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - return ResponseEntity.ok(resultMap); - } - - - @MethodLog - @RequestMapping(path = "{type}/{id}/metadata", method = RequestMethod.GET) - public ResponseEntity compareWithSnapshot(HttpServletRequest req, @PathVariable("type") String type, @PathVariable("id") Long id) { - String userName = SecurityFilter.getLoginUsername(req); - ResultMap resultMap = null; - try { - resultMap = dssWidgetService.compareWithSnapshot(userName, type, id); - } catch (Exception e) { - log.error("get widget metadata error, because: " , e); - resultMap = new ResultMap().fail().message(e.getMessage()); - } - - return ResponseEntity.ok(resultMap); - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetResultfulApi.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetResultfulApi.java new file mode 100644 index 000000000..ab4a9c11f --- /dev/null +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/restful/WidgetResultfulApi.java @@ -0,0 +1,95 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.visualis.restful; + +import com.webank.wedatasphere.linkis.server.Message; +import edp.davinci.core.common.Constants; +import edp.davinci.dao.ProjectMapper; +import edp.davinci.dao.ViewMapper; +import edp.davinci.dao.WidgetMapper; +import edp.davinci.model.Project; +import edp.davinci.model.View; +import edp.davinci.model.Widget; +import lombok.extern.slf4j.Slf4j; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.*; +import javax.ws.rs.core.Context; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; +import java.util.Map; + +/** + * Created by johnnwang on 2019/1/24. 
+ */ +@Slf4j +@Path(Constants.RESTFUL_BASE_PATH + "widgets") +@Component +@Produces(MediaType.APPLICATION_JSON) +@Consumes(MediaType.APPLICATION_JSON) +public class WidgetResultfulApi { + + @Autowired + private ViewMapper viewMapper; + + @Autowired + private WidgetMapper widgetMapper; + + @Autowired + private ProjectMapper projectMapper; + + @POST + public Response updateProjectId(@Context HttpServletRequest req, Map json) { + Message message = null; + try { + Long widgetId = ((Integer) json.getOrDefault("widgetId", -1)).longValue(); + Long projectId = ((Integer) json.getOrDefault("projectId", -1)).longValue(); + Long viewId = ((Integer) json.getOrDefault("viewId", -1)).longValue(); + + Project project = projectMapper.getById(projectId.longValue()); + if (project == null) { + message = Message.error("项目不存在"); + return Message.messageToResponse(message); + } + Widget widget = widgetMapper.getById(widgetId); + if (widget == null) { + message = Message.error("保存图表到数据开发失败,对应的图表不存在"); + return Message.messageToResponse(message); + } + widget.setProjectId(projectId); + widgetMapper.update(widget); + View view = viewMapper.getById(viewId); + if (view != null) { + view.setName(widget.getName()); + view.setProjectId(projectId); + viewMapper.update(view); + } + message = Message.ok(); + message.data("widgetId", widgetId); + message.data("projectId", projectId); + message.data("viewId", viewId); + return Message.messageToResponse(message); + } catch (Exception e) { + log.error("保存图表失败:", e); + message = Message.error("保存图表失败:" + e.getMessage()); + return Message.messageToResponse(message); + } + } + +} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDashboradService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDashboradService.java deleted file mode 100644 index 59bcfae49..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDashboradService.java +++ /dev/null @@ -1,16 +0,0 @@ 
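The id-extraction pattern used in `updateProjectId` above, `((Integer) json.getOrDefault("widgetId", -1)).longValue()`, throws a `ClassCastException` if the JSON layer deserializes a number as `Long` or `Double` rather than `Integer`. A defensive variant casts through `Number` instead (the `JsonIds.idFrom` helper here is a hypothetical illustration, not part of the Visualis codebase):

```java
import java.util.Map;

public class JsonIds {

    // Read a numeric id from a deserialized JSON map, tolerating whichever
    // Number subtype the JSON library chose; missing or non-numeric values
    // fall back to -1, matching the endpoint's default.
    static long idFrom(Map<String, Object> json, String key) {
        Object v = json.getOrDefault(key, -1);
        if (v instanceof Number) {
            return ((Number) v).longValue();
        }
        return -1L;
    }

    public static void main(String[] args) {
        // Works whether the id arrived as Integer or Long.
        Map<String, Object> json = Map.of("widgetId", 42, "projectId", 7L);
        System.out.println(idFrom(json, "widgetId"));
        System.out.println(idFrom(json, "projectId"));
    }
}
```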
-package com.webank.wedatasphere.dss.visualis.service; - -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; - -import java.util.Map; -import java.util.Set; - -public interface DssDashboradService { - - void exportDashboardPortals(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject); - - void importDashboard(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog); - - void copyDashboardPortal(Map> moduleIdsMap, ExportedProject exportedProject); -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDisplayService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDisplayService.java deleted file mode 100644 index 53cd6ba9a..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssDisplayService.java +++ /dev/null @@ -1,16 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; - -import java.util.Map; -import java.util.Set; - -public interface DssDisplayService { - - void exportDisplays(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject) throws Exception; - - void importDisplay(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception; - - void copyDisplay(Map> moduleIdsMap, ExportedProject exportedProject) throws Exception; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssProjectService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssProjectService.java deleted file mode 100644 index d61b08c65..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssProjectService.java +++ /dev/null @@ -1,33 
+0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import edp.davinci.core.common.ResultMap; - -import java.util.Map; - -public interface DssProjectService { - - /** - * get default project. - */ - ResultMap getDefaultProject(String userName) throws Exception; - - /** - * 工程导出 - */ - ResultMap exportProject(Map params, String userName) throws Exception; - - /** - * 工程导入 - */ - ResultMap importProject(Map params, String userName) throws Exception; - - /** - * 工程复制 - */ - ResultMap copyProject(Map params, String userName) throws Exception; - - /** - * 读取工程 - */ - ResultMap readProject(String fileName, Long projectId, String userName) throws Exception; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssSourceService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssSourceService.java deleted file mode 100644 index 380c575da..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssSourceService.java +++ /dev/null @@ -1,10 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; - -public interface DssSourceService { - - void importSource(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception; - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssViewService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssViewService.java deleted file mode 100644 index 227d78f80..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssViewService.java +++ /dev/null @@ -1,30 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo; -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import 
com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; -import edp.davinci.core.common.ResultMap; - -import javax.servlet.http.HttpServletRequest; -import java.util.List; -import java.util.Map; -import java.util.Set; - -public interface DssViewService { - - List getAvailableEngineTypes(HttpServletRequest req, Long id) throws Exception; - - ResultMap createView(HttpServletRequest req, DWCResultInfo dwcResultInfo) throws Exception; - - ResultMap getViewData(HttpServletRequest req, Long id) throws Exception; - - ResultMap submitQuery(HttpServletRequest req, Long id) throws Exception; - - ResultMap isHiveDataSource(HttpServletRequest req, Long id) throws Exception; - - void exportViews(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject) throws Exception; - - void importViews(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception; - - void copyView(Map> moduleIdsMap, ExportedProject exportedProject) throws Exception; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssWidgetService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssWidgetService.java deleted file mode 100644 index e701978a4..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/DssWidgetService.java +++ /dev/null @@ -1,31 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; -import edp.davinci.core.common.ResultMap; - -import java.util.Map; -import java.util.Set; - -public interface DssWidgetService { - - /** - * When the DSS workflow executes, the ContextID is updated, - * return ture update success, otherwise update false. 
- */ - ResultMap rename(Map params) throws Exception; - - ResultMap smartCreateFromSql(String userName, Map params) throws Exception; - - ResultMap updateContextId(Long widgetId, String contextId) throws Exception; - - ResultMap getWidgetData(String userName, Long widgetId) throws Exception; - - ResultMap compareWithSnapshot(String userName, String type, Long id) throws Exception; - - void exportWidgets(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject) throws Exception; - - void importWidget(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception; - - void copyWidget(String contextIdStr, Map> moduleIdsMap, ExportedProject exportedProject) throws Exception; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/ParamsService.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/ParamsService.java deleted file mode 100644 index f26d1a2c7..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/ParamsService.java +++ /dev/null @@ -1,14 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - - -import edp.davinci.model.Params; - -import java.util.List; -import java.util.Map; - -public interface ParamsService { - - void insertParams(Params params) throws Exception; - - List> getGraphInfo(String projectName, String userName) throws Exception; -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/Utils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/Utils.java deleted file mode 100644 index 42096d9cc..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/Utils.java +++ /dev/null @@ -1,86 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service; - -import com.google.common.collect.Maps; -import com.google.common.collect.Sets; -import com.webank.wedatasphere.dss.visualis.utils.StringConstant; -import 
org.apache.commons.lang.StringUtils; - -import java.util.Arrays; -import java.util.Map; -import java.util.Set; -import java.util.regex.Matcher; -import java.util.regex.Pattern; -import java.util.stream.Collectors; - -public class Utils { - - public static Map> getModuleIdsMap(Map params) { - - Map> map = Maps.newHashMap(); - String widgetIdsStr = params.get(StringConstant.WIDGET_IDS); - String displayIdsStr = params.get(StringConstant.DISPLAY_IDS); - String dashboardPortalIdsStr = params.get(StringConstant.DASHBOARD_PORTAL_IDS); - String viewIdsStr = params.get(StringConstant.VIEW_IDS); - - Set widgetIds = Sets.newHashSet(); - Set displayIds = Sets.newHashSet(); - Set dashboardPortalIds = Sets.newHashSet(); - Set viewIds = Sets.newHashSet(); - - if (StringUtils.isNotEmpty(widgetIdsStr)) { - widgetIds = Arrays.stream(StringUtils.split(widgetIdsStr, StringConstant.COMMA)) - .map(Long::parseLong).collect(Collectors.toSet()); - } - if (StringUtils.isNotEmpty(displayIdsStr)) { - displayIds = Arrays.stream(StringUtils.split(displayIdsStr, StringConstant.COMMA)) - .map(Long::parseLong).collect(Collectors.toSet()); - } - if (StringUtils.isNotEmpty(dashboardPortalIdsStr)) { - dashboardPortalIds = Arrays.stream(StringUtils.split(dashboardPortalIdsStr, StringConstant.COMMA)) - .map(Long::parseLong).collect(Collectors.toSet()); - } - if (StringUtils.isNotEmpty(viewIdsStr)) { - viewIds = Arrays.stream(StringUtils.split(viewIdsStr, StringConstant.COMMA)) - .map(Long::parseLong).collect(Collectors.toSet()); - } - - map.put(StringConstant.WIDGET_IDS, widgetIds); - map.put(StringConstant.DISPLAY_IDS, displayIds); - map.put(StringConstant.DASHBOARD_PORTAL_IDS, dashboardPortalIds); - map.put(StringConstant.VIEW_IDS, viewIds); - - return map; - } - - public static String updateName(String name, String versionSuffix) { - if (StringUtils.isBlank(versionSuffix)) { - return name; - } - String shortName = getShortName(name); - return shortName + "_" + versionSuffix; - } - - private static 
String getShortName(String longName) { - String shortName; - String version = getSuffixVersion(longName); - if (null == version) { - shortName = longName; - } else { - shortName = longName.substring(0, longName.length() - version.length() - 1); - } - return shortName; - } - - private static String getSuffixVersion(String longName) { - String version; - Pattern suffixVersionPattern = Pattern.compile("[v]\\d_[v][0-9]{6}"); - Matcher matcherVersionPattern = suffixVersionPattern.matcher(longName); - if (matcherVersionPattern.find()) { - version = matcherVersionPattern.group(); - } else { - version = null; - } - return version; - } - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HiveDBHelper.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/hive/HiveDBHelper.java similarity index 65% rename from server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HiveDBHelper.java rename to server/src/main/java/com/webank/wedatasphere/dss/visualis/service/hive/HiveDBHelper.java index 4a041bdae..98c1946f4 100644 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HiveDBHelper.java +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/hive/HiveDBHelper.java @@ -1,28 +1,55 @@ -package com.webank.wedatasphere.dss.visualis.utils; +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ +package com.webank.wedatasphere.dss.visualis.service.hive; import com.google.common.collect.Lists; import com.google.gson.Gson; import com.webank.wedatasphere.dss.visualis.model.HiveSource; -import com.webank.wedatasphere.dss.visualis.model.hivemodel.HiveColumnModel; -import com.webank.wedatasphere.dss.visualis.model.hivemodel.HiveDBModel; -import com.webank.wedatasphere.dss.visualis.model.hivemodel.HiveTableModel; +import com.webank.wedatasphere.dss.visualis.utils.HttpUtils; +import com.webank.wedatasphere.dss.visualis.utils.model.HiveColumnModel; +import com.webank.wedatasphere.dss.visualis.utils.model.HiveDBModel; +import com.webank.wedatasphere.dss.visualis.utils.model.HiveSchemaModel; +import com.webank.wedatasphere.dss.visualis.utils.model.HiveTableModel; +import com.webank.wedatasphere.linkis.adapt.LinkisUtils; import edp.core.model.QueryColumn; import edp.core.model.TableInfo; import edp.core.utils.TokenUtils; +import edp.davinci.core.common.ResultMap; import edp.davinci.model.Source; +import edp.davinci.model.User; import org.apache.commons.lang.StringUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; +import javax.servlet.http.HttpServletRequest; import java.util.ArrayList; import java.util.List; +/** + * created by cooperyang on 2019/1/23 + * Description: + */ + @Component -public class HiveDBHelper { +public class HiveDBHelper{ private static final Logger logger = LoggerFactory.getLogger(HiveDBHelper.class); @@ -31,37 +58,37 @@ public class HiveDBHelper { @Autowired private TokenUtils tokenUtils; - public List getHiveDBNames(String ticketId) { - if (ticketId == null) { + public List getHiveDBNames(String ticketId){ + if (ticketId == null){ logger.error("cookie 中没有ticketID, 不能进行对hive数据库的操作"); return null; } String hiveDBJson = HttpUtils.getDbs(ticketId); - if (StringUtils.isEmpty(hiveDBJson)) { + if 
(StringUtils.isEmpty(hiveDBJson)){ logger.info("从database这个服务获取的内容为空,不能进行解析数据库名的操作,将返回null"); return null; } HiveDBModel hiveDBModel = new Gson().fromJson(hiveDBJson, HiveDBModel.class); List dbNames = new ArrayList<>(); - for (HiveDBModel.HiveDB db : hiveDBModel.getData().getDbs()) { + for (HiveDBModel.HiveDB db : hiveDBModel.getData().getDbs()){ dbNames.add(db.getDbName()); } return dbNames; } - public List getHiveTables(String dbName, String ticketId) { - if (ticketId == null) { + public List getHiveTables(String dbName, String ticketId){ + if (ticketId == null){ logger.error("cookie 中没有ticketID, 不能进行对hive数据库的操作"); return null; } String dbTableJson = HttpUtils.getTables(ticketId, dbName); - if (StringUtils.isEmpty(dbTableJson)) { + if (StringUtils.isEmpty(dbTableJson)){ logger.info("从database这个服务获取的内容为空,不能进行解析table名的操作,将返回null"); return null; } List queryColumns = Lists.newArrayList(); HiveTableModel hiveTableModel = new Gson().fromJson(dbTableJson, HiveTableModel.class); - for (HiveTableModel.HiveTable hiveTable : hiveTableModel.getData().getTables()) { + for(HiveTableModel.HiveTable hiveTable : hiveTableModel.getData().getTables()){ String type = hiveTable.isView() ? 
"VIEW" : "TABLE"; QueryColumn queryColumn = new QueryColumn(hiveTable.getTableName(), type); queryColumns.add(queryColumn); @@ -69,39 +96,39 @@ public List getHiveTables(String dbName, String ticketId) { return queryColumns; } - public TableInfo getHiveTableInfo(String dbName, String tableName, String ticketId) { - if (ticketId == null) { + public TableInfo getHiveTableInfo(String dbName, String tableName, String ticketId){ + if (ticketId == null){ logger.error("cookie 中没有ticketID, 不能进行对hive数据库的操作"); return null; } String columnJson = HttpUtils.getColumns(dbName, tableName, ticketId); - if (StringUtils.isEmpty(columnJson)) { + if (StringUtils.isEmpty(columnJson)){ logger.info("从database这个服务获取的内容为空,不能进行解析columns的操作,将返回null"); return null; } List queryColumns = Lists.newArrayList(); HiveColumnModel hiveColumnModel = new Gson().fromJson(columnJson, HiveColumnModel.class); - for (HiveColumnModel.Column column : hiveColumnModel.getData().getColumns()) { + for(HiveColumnModel.Column column : hiveColumnModel.getData().getColumns()){ QueryColumn queryColumn = new QueryColumn(column.getColumnName(), column.getColumnType()); queryColumns.add(queryColumn); } TableInfo tableInfo = new TableInfo(tableName, Lists.newArrayList(), queryColumns); - return tableInfo; + return tableInfo; } - public List sourcesToHiveSources(List sources) { + public List sourcesToHiveSources(List sources){ List retList = new ArrayList<>(); - if (sources != null && sources.size() > 0) { - for (Source source : sources) { + if (sources != null && sources.size() > 0){ + for(Source source : sources){ retList.add(sourceToHiveSource(source)); } } return retList; } - public HiveSource sourceToHiveSource(Source source) { + public HiveSource sourceToHiveSource(Source source){ HiveSource hiveSource = new HiveSource(); hiveSource.setId(source.getId()); hiveSource.setName(source.getName()); diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DashboardServiceImpl.java 
b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DashboardServiceImpl.java deleted file mode 100644 index 916919759..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DashboardServiceImpl.java +++ /dev/null @@ -1,142 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.service.impl; - -import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.visualis.service.DssDashboradService; -import com.webank.wedatasphere.dss.visualis.service.Utils; -import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject; -import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog; -import com.webank.wedatasphere.dss.visualis.model.optmodel.PlainDashboard; -import com.webank.wedatasphere.dss.visualis.model.optmodel.PlainDashboardPortal; -import com.webank.wedatasphere.dss.visualis.utils.StringConstant; -import edp.davinci.dao.DashboardMapper; -import edp.davinci.dao.DashboardPortalMapper; -import edp.davinci.dao.MemDashboardWidgetMapper; -import edp.davinci.model.Dashboard; -import edp.davinci.model.DashboardPortal; -import edp.davinci.model.MemDashboardWidget; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang.StringUtils; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Service; - -import java.util.List; -import java.util.Map; -import java.util.Set; - -@Slf4j -@Service("dssDashboradService") -public class DashboardServiceImpl implements DssDashboradService { - - @Autowired - private DashboardMapper dashboardMapper; - - @Autowired - private DashboardPortalMapper dashboardPortalMapper; - - @Autowired - private MemDashboardWidgetMapper memDashboardWidgetMapper; - - @Override - public void exportDashboardPortals(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject) { - List exportedDashboardPortals = Lists.newArrayList(); - List dashboardPortals = Lists.newArrayList(); - if (partial) { - 
-            Set<Long> idsSet = moduleIdsMap.get(StringConstant.DASHBOARD_PORTAL_IDS);
-            if (idsSet.size() > 0) {
-                idsSet.stream().map(dashboardPortalMapper::getById).forEach(dashboardPortals::add);
-            }
-
-        } else {
-            dashboardPortals = dashboardPortalMapper.getByProject(projectId);
-        }
-        for (DashboardPortal dashboardPortal : dashboardPortals) {
-            PlainDashboardPortal plainDashboardPortal = new PlainDashboardPortal();
-            List<PlainDashboard> exportedDashboards = Lists.newArrayList();
-            List<Dashboard> dashboards = dashboardMapper.getByPortalId(dashboardPortal.getId());
-            for (Dashboard dashboard : dashboards) {
-                PlainDashboard exportedDashboard = new PlainDashboard();
-                exportedDashboard.setDashboard(dashboard);
-                List<MemDashboardWidget> memDashboardWidgets = memDashboardWidgetMapper.getByDashboardId(dashboard.getId());
-                memDashboardWidgets.forEach(m -> moduleIdsMap.get(StringConstant.WIDGET_IDS).add(m.getWidgetId()));
-                exportedDashboard.setMemDashboardWidgets(memDashboardWidgets);
-                exportedDashboards.add(exportedDashboard);
-            }
-            plainDashboardPortal.setDashboardPortal(dashboardPortal);
-            plainDashboardPortal.setDashboards(exportedDashboards);
-            exportedDashboardPortals.add(plainDashboardPortal);
-        }
-        exportedProject.setDashboardPortals(exportedDashboardPortals);
-        log.info("exporting project, export dashboardPortals: {}", exportedProject);
-    }
-
-    @Override
-    public void importDashboard(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) {
-        List<PlainDashboardPortal> dashboardPortals = exportedProject.getDashboardPortals();
-        if (dashboardPortals == null) {
-            return;
-        }
-        for (PlainDashboardPortal plainDashboardPortal : dashboardPortals) {
-            DashboardPortal dashboardPortal = plainDashboardPortal.getDashboardPortal();
-            Long oldPortalId = dashboardPortal.getId();
-            dashboardPortal.setProjectId(projectId);
-            dashboardPortal.setName(Utils.updateName(dashboardPortal.getName(), versionSuffix));
-
-            // Import the dashboardPortal: check whether a portal with the same name already exists
-            Long existingPortalId = dashboardPortalMapper.getByNameWithProjectId(dashboardPortal.getName(), projectId);
-            if (existingPortalId != null) {
-                // the portal already exists
-                dashboardPortal.setId(existingPortalId);
-                dashboardMapper.deleteByPortalId(dashboardPortal.getId());
-                memDashboardWidgetMapper.deleteByPortalId(dashboardPortal.getId());
-                idCatalog.getDashboardPortal().put(oldPortalId, existingPortalId);
-            } else {
-                // the portal does not exist yet
-                dashboardPortalMapper.insert(dashboardPortal);
-                // the generated id becomes the new portalId
-                idCatalog.getDashboardPortal().put(oldPortalId, dashboardPortal.getId());
-            }
-            // import dashboards
-            for (PlainDashboard plainDashboard : plainDashboardPortal.getDashboards()) {
-                Dashboard dashboard = plainDashboard.getDashboard();
-                Long oldDashboardId = dashboard.getId();
-                dashboard.setDashboardPortalId(dashboardPortal.getId());
-
-                dashboardMapper.insert(dashboard);
-                idCatalog.getDashboard().put(oldDashboardId, dashboard.getId());
-                // import dashboard-widget relations
-                for (MemDashboardWidget memDashboardWidget : plainDashboard.getMemDashboardWidgets()) {
-                    Long oldMemId = memDashboardWidget.getId();
-                    memDashboardWidget.setDashboardId(dashboard.getId());
-                    memDashboardWidget.setWidgetId(idCatalog.getWidget().get(memDashboardWidget.getWidgetId()));
-                    memDashboardWidgetMapper.insert(memDashboardWidget);
-                    idCatalog.getMemDashboardWidget().put(oldMemId, memDashboardWidget.getId());
-                }
-            }
-            // remap parentId if present
-            for (PlainDashboard plainDashboard : plainDashboardPortal.getDashboards()) {
-                Dashboard dashboard = plainDashboard.getDashboard();
-                Long parentId = dashboard.getParentId() == 0 ? 0L : idCatalog.getDashboard().get(dashboard.getParentId());
-                dashboard.setParentId(parentId);
-                if (StringUtils.isNotBlank(dashboard.getFullParentId())) {
-                    List<Long> ids = Lists.newArrayList();
-                    for (String old : dashboard.getFullParentId().split(StringConstant.COMMA)) {
-                        ids.add(idCatalog.getDashboard().get(Long.parseLong(old)));
-                    }
-                    dashboard.setFullParentId(StringUtils.join(ids, StringConstant.COMMA));
-                }
-                dashboardMapper.update(dashboard);
-            }
-        }
-    }
-
-    @Override
-    public void copyDashboardPortal(Map<String, Set<Long>> moduleIdsMap, ExportedProject exportedProject) {
-        Set<Long> dashboardPortalIds = moduleIdsMap.get(StringConstant.DASHBOARD_PORTAL_IDS);
-        if (!dashboardPortalIds.isEmpty()) {
-            PlainDashboardPortal plainDashboardPortal = exportedProject.getDashboardPortals().get(0);
-            exportedProject.setDashboardPortals(Lists.newArrayList(plainDashboardPortal));
-        }
-    }
-
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DisplayServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DisplayServiceImpl.java
deleted file mode 100644
index 25630d5db..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/DisplayServiceImpl.java
+++ /dev/null
@@ -1,107 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.google.common.collect.Lists;
-import com.webank.wedatasphere.dss.visualis.service.DssDisplayService;
-import com.webank.wedatasphere.dss.visualis.service.Utils;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.PlainDisplay;
-import com.webank.wedatasphere.dss.visualis.utils.StringConstant;
-import edp.davinci.dao.DisplayMapper;
-import edp.davinci.dao.DisplaySlideMapper;
-import edp.davinci.dao.MemDisplaySlideWidgetMapper;
-import edp.davinci.model.Display;
-import edp.davinci.model.DisplaySlide;
-import edp.davinci.model.MemDisplaySlideWidget;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-
-
-@Slf4j
-@Service("dssDisplayService")
-public class DisplayServiceImpl implements DssDisplayService {
-
-    @Autowired
-    DisplayMapper displayMapper;
-
-    @Autowired
-    DisplaySlideMapper displaySlideMapper;
-
-    @Autowired
-    private MemDisplaySlideWidgetMapper memDisplaySlideWidgetMapper;
-
-    @Override
-    public void exportDisplays(Long projectId, Map<String, Set<Long>> moduleIdsMap, boolean partial, ExportedProject exportedProject) throws Exception {
-        List<PlainDisplay> exportedDisplays = Lists.newArrayList();
-        List<Display> displays = Lists.newArrayList();
-        if (partial) {
-            Set<Long> idsSet = moduleIdsMap.get(StringConstant.DISPLAY_IDS);
-            if (idsSet.size() > 0) {
-                idsSet.stream().map(displayMapper::getById).forEach(displays::add);
-            }
-        } else {
-            displays = displayMapper.getByProject(projectId);
-        }
-        for (Display display : displays) {
-            PlainDisplay plainDisplay = new PlainDisplay();
-            plainDisplay.setDisplay(display);
-            plainDisplay.setDisplaySlide(displaySlideMapper.selectByDisplayId(display.getId()).get(0));
-            List<MemDisplaySlideWidget> memDisplaySlideWidgets = memDisplaySlideWidgetMapper.getMemDisplaySlideWidgetListBySlideId(plainDisplay.getDisplaySlide().getId());
-            memDisplaySlideWidgets.forEach(m -> moduleIdsMap.get(StringConstant.WIDGET_IDS).add(m.getWidgetId()));
-            plainDisplay.setMemDisplaySlideWidgets(memDisplaySlideWidgets);
-            exportedDisplays.add(plainDisplay);
-        }
-        log.info("exporting project, export displays: {}", exportedProject);
-        exportedProject.setDisplays(exportedDisplays);
-    }
-
-    @Override
-    public void importDisplay(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception {
-        List<PlainDisplay> displays = exportedProject.getDisplays();
-        if (displays == null) {
-            return;
-        }
-        for (PlainDisplay plainDisplay : displays) {
-            Display display = plainDisplay.getDisplay();
-            Long oldDisplayId = display.getId();
-            display.setProjectId(projectId);
-            display.setName(Utils.updateName(display.getName(), versionSuffix));
-            Long existingId = displayMapper.getByNameWithProjectId(display.getName(), projectId);
-            if (existingId != null) {
-                display.setId(existingId);
-                displaySlideMapper.deleteByDisplayId(display.getId());
-                memDisplaySlideWidgetMapper.deleteByDisplayId(display.getId());
-                idCatalog.getDisplay().put(oldDisplayId, display.getId());
-            } else {
-                displayMapper.insert(display);
-                idCatalog.getDisplay().put(oldDisplayId, display.getId());
-            }
-            DisplaySlide displaySlide = plainDisplay.getDisplaySlide();
-            Long oldSlideId = displaySlide.getId();
-            displaySlide.setDisplayId(display.getId());
-            displaySlideMapper.insert(displaySlide);
-            idCatalog.getDisplaySlide().put(oldSlideId, displaySlide.getId());
-            for (MemDisplaySlideWidget memDisplaySlideWidget : plainDisplay.getMemDisplaySlideWidgets()) {
-                Long oldMemId = memDisplaySlideWidget.getId();
-                memDisplaySlideWidget.setDisplaySlideId(displaySlide.getId());
-                memDisplaySlideWidget.setWidgetId(idCatalog.getWidget().get(memDisplaySlideWidget.getWidgetId()));
-                memDisplaySlideWidgetMapper.insert(memDisplaySlideWidget);
-                idCatalog.getMemDisplaySlideWidget().put(oldMemId, memDisplaySlideWidget.getId());
-            }
-        }
-    }
-
-    @Override
-    public void copyDisplay(Map<String, Set<Long>> moduleIdsMap, ExportedProject exportedProject) throws Exception {
-        Set<Long> displayIds = moduleIdsMap.get(StringConstant.DISPLAY_IDS);
-        if (!displayIds.isEmpty()) {
-            PlainDisplay display = exportedProject.getDisplays().get(0);
-            exportedProject.setDisplays(Lists.newArrayList(display));
-        }
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ParamsServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ParamsServiceImpl.java
deleted file mode 100644
index 0863273c7..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ParamsServiceImpl.java
+++ /dev/null
@@ -1,100 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.alibaba.fastjson.JSONObject;
-import com.google.common.collect.Iterables;
-import com.google.common.collect.Lists;
-import com.google.common.collect.Maps;
-import com.webank.wedatasphere.dss.visualis.service.ParamsService;
-import com.webank.wedatasphere.dss.visualis.content.CommonContant;
-import com.webank.wedatasphere.dss.visualis.content.DashboardContant;
-import com.webank.wedatasphere.dss.visualis.content.ViewContant;
-import com.webank.wedatasphere.dss.visualis.content.WidgetContant;
-import edp.core.common.job.ScheduleService;
-import edp.davinci.dao.*;
-import edp.davinci.model.*;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-
-import java.util.List;
-import java.util.Map;
-import java.util.UUID;
-
-@Service
-public class ParamsServiceImpl implements ParamsService {
-
-    private static Logger logger = LoggerFactory.getLogger(ParamsServiceImpl.class);
-
-    @Autowired
-    private ParamsMapper paramsMapper;
-
-    @Autowired
-    private UserMapper userMapper;
-
-    @Autowired
-    private ProjectMapper projectMapper;
-
-    @Autowired
-    private DashboardPortalMapper dashboardPortalMapper;
-
-    @Autowired
-    private DashboardMapper dashboardMapper;
-
-    @Autowired
-    private ScheduleService scheduleService;
-
-    @Autowired
-    private MemDashboardWidgetMapper memDashboardWidgetMapper;
-
-    @Autowired
-    private WidgetMapper widgetMapper;
-
-    @Override
-    public void insertParams(Params params) throws Exception {
-
-        params.setUuid(UUID.randomUUID().toString());
-        params.setParams(JSONObject.toJSONString(params.getParamDetails()));
-
-        logger.info("Params is creating, Params is {}", params);
-        paramsMapper.insert(params);
-    }
-
-    @Override
-    public List<Map<String, Object>> getGraphInfo(String projectName, String userName) {
-        User user = userMapper.selectByUsername(userName);
-
-        Project project = Iterables.getFirst(projectMapper.getProjectByNameWithUserId(projectName, user.getId()), null);
-        if (project == null) {
-            logger.error("Project does not exist");
-            return null;
-        }
-
-        List<Map<String, Object>> dashboardsInfo = Lists.newArrayList();
-        List<DashboardPortal> dashboardPortals = dashboardPortalMapper.getByProject(project.getId());
-        for (DashboardPortal portal : dashboardPortals) {
-            List<Dashboard> dashboardList = dashboardMapper.getByPortalId(portal.getId());
-            for (Dashboard dashboard : dashboardList) {
-                Map<String, Object> dashboardInfo = Maps.newHashMap();
-                dashboardInfo.put(DashboardContant.DASHBOARD_ID, dashboard.getId());
-                dashboardInfo.put(DashboardContant.NAME, dashboard.getName());
-                dashboardInfo.put(CommonContant.URL, scheduleService.getContentUrl(user.getId(), DashboardContant.DASHBOARD, dashboard.getId()));
-
-                List<Map<String, Object>> widgetsInfo = Lists.newArrayList();
-                List<MemDashboardWidget> memDashboardWidgets = memDashboardWidgetMapper.getByDashboardId(dashboard.getId());
-                for (MemDashboardWidget memDashboardWidget : memDashboardWidgets) {
-                    Map<String, Object> widgetInfo = Maps.newHashMap();
-                    Widget widget = widgetMapper.getById(memDashboardWidget.getWidgetId());
-                    widgetInfo.put(WidgetContant.WIDGET_ID, widget.getId());
-                    widgetInfo.put(WidgetContant.NAME, widget.getName());
-                    widgetInfo.put(ViewContant.VIEW_ID, widget.getViewId());
-                    widgetInfo.put(CommonContant.URL, scheduleService.getContentUrl(user.getId(), WidgetContant.WIDGET, widget.getId()));
-                    widgetsInfo.add(widgetInfo);
-                }
-                dashboardInfo.put(WidgetContant.WIDGETS, widgetsInfo);
-                dashboardsInfo.add(dashboardInfo);
-            }
-        }
-        return dashboardsInfo;
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ProjectServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ProjectServiceImpl.java
deleted file mode 100644
index fb2f3e2b6..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ProjectServiceImpl.java
+++ /dev/null
@@ -1,304 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.webank.wedatasphere.dss.visualis.service.DssProjectService;
-import com.webank.wedatasphere.dss.visualis.service.DssSourceService;
-import com.webank.wedatasphere.dss.visualis.service.DssViewService;
-import com.webank.wedatasphere.dss.visualis.service.DssWidgetService;
-import com.webank.wedatasphere.dss.visualis.service.DssDisplayService;
-import com.webank.wedatasphere.dss.visualis.service.DssDashboradService;
-import com.webank.wedatasphere.dss.visualis.service.Utils;
-import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig;
-import com.webank.wedatasphere.dss.visualis.enums.ModuleEnum;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog;
-import com.webank.wedatasphere.dss.visualis.utils.StringConstant;
-import edp.davinci.core.common.ResultMap;
-import org.apache.commons.io.FileUtils;
-import org.apache.linkis.adapt.LinkisUtils;
-import org.apache.linkis.bml.client.BmlClient;
-import org.apache.linkis.bml.client.BmlClientFactory;
-import org.apache.linkis.bml.protocol.BmlDownloadResponse;
-import org.apache.linkis.bml.protocol.BmlUploadResponse;
-import org.apache.linkis.common.exception.ErrorException;
-import edp.core.exception.ServerException;
-import edp.davinci.dao.ProjectMapper;
-import edp.davinci.dao.UserMapper;
-import edp.davinci.model.Project;
-import edp.davinci.model.User;
-import lombok.extern.slf4j.Slf4j;
-import org.apache.commons.collections.CollectionUtils;
-import org.apache.commons.io.IOUtils;
-import org.apache.commons.lang.StringUtils;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-
-import javax.annotation.Resource;
-import java.io.ByteArrayInputStream;
-import java.io.File;
-import java.nio.charset.StandardCharsets;
-import java.util.*;
-
-@Slf4j
-@Service("dssProjectService")
-public class ProjectServiceImpl implements DssProjectService {
-
-    @Autowired
-    private UserMapper userMapper;
-
-    @Autowired
-    private ProjectMapper projectMapper;
-
-    @Resource(name = "dssSourceService")
-    private DssSourceService dssSourceService;
-
-    @Resource(name = "dssViewService")
-    private DssViewService dssViewService;
-
-    @Resource(name = "dssWidgetService")
-    private DssWidgetService dssWidgetService;
-
-    @Resource(name = "dssDisplayService")
-    private DssDisplayService dssDisplayService;
-
-    @Resource(name = "dssDashboradService")
-    private DssDashboradService dssDashboradService;
-
-    @Override
-    public ResultMap getDefaultProject(String userName) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        Project project;
-        User user = userMapper.selectByUsername(userName);
-        List<Project> defaultProjects = projectMapper.getProjectByNameWithUserId(CommonConfig.DEFAULT_PROJECT_NAME().getValue(), user.getId());
-        if (CollectionUtils.isEmpty(defaultProjects)) {
-            project = new Project();
-            project.setName(CommonConfig.DEFAULT_PROJECT_NAME().getValue());
-            project.setCreateTime(new Date());
-            project.setCreateUserId(user.getId());
-            project.setDescription(StringConstant.EMPTY);
-            project.setInitialOrgId(null);
-            project.setIsTransfer(false);
-            project.setPic(null);
-            project.setStarNum(0);
-            project.setVisibility(true);
-            project.setOrgId(null);
-            project.setUserId(user.getId());
-            projectMapper.insert(project);
-        } else {
-            project = defaultProjects.get(0);
-        }
-        resultDataMap.put("project", project);
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    /**
-     * Steps to export a project:
-     * 1. Receive the export request JSON passed from DSS.
-     * 2. Read the project id and the full-export flag from the JSON.
-     * 3. Using the project id, collect the following into the exported project:
-     *    3.1. the associated Displays;
-     *    3.2. the associated Dashboards;
-     *    3.3. the associated Widgets;
-     *    3.4. the associated Views;
-     *    3.5. the associated Sources;
-     *    3.6. build an ExportedProject object.
-     * 4. Serialize the ExportedProject to JSON and upload it to BML.
-     * 5. Put the returned BML resource id and version into the response and return.
-     */
-    @Override
-    public ResultMap exportProject(Map<String, String> params, String userName) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        ExportedProject exportedProject;
-
-        Long projectId = Long.parseLong(params.get(StringConstant.PROJECT_ID));
-        Boolean partial = Boolean.parseBoolean(params.get(StringConstant.PARTIAL));
-        Map<String, Set<Long>> moduleIdsMap = Utils.getModuleIdsMap(params);
-
-        exportedProject = doExport(projectId, moduleIdsMap, partial);
-
-        String exported = LinkisUtils.gson().toJson(exportedProject);
-        BmlClient bmlClient = BmlClientFactory.createBmlClient(userName);
-        BmlUploadResponse bmlUploadResponse = bmlClient.uploadShareResource(userName, exportedProject.getName(),
-                StringConstant.BML_FILE_PREFIX + UUID.randomUUID(), new ByteArrayInputStream(exported.getBytes(StandardCharsets.UTF_8)));
-
-        if (bmlUploadResponse == null || !bmlUploadResponse.isSuccess()) {
-            throw new ServerException("cannot upload exported data to BML");
-        }
-
-        log.info("{} is exporting the project, uploaded to BML the resourceID is {} and the version is {}",
-                userName, bmlUploadResponse.resourceId(), bmlUploadResponse.version());
-
-        resultDataMap.put("resourceId", bmlUploadResponse.resourceId());
-        resultDataMap.put("version", bmlUploadResponse.version());
-
-        resultMap.success().payload(resultDataMap);
-        return resultMap;
-    }
-
-    @Override
-    public ResultMap importProject(Map<String, String> params, String userName) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        String resourceId = params.get(StringConstant.RESOURCE_ID);
-        String version = params.get(StringConstant.VERSION);
-        Long projectId = Long.parseLong(params.get(StringConstant.PROJECT_ID));
-        String projectVersion = params.get(StringConstant.PROJECT_VERSION);
-        String flowVersion = params.get(StringConstant.WORKFLOW_VERSION);
-        String versionSuffix = projectVersion + "_" + flowVersion;
-        BmlClient bmlClient = BmlClientFactory.createBmlClient(userName);
-        BmlDownloadResponse bmlDownloadResponse = bmlClient.downloadShareResource(userName, resourceId, version);
-        if (bmlDownloadResponse == null || !bmlDownloadResponse.isSuccess()) {
-            throw new ServerException("cannot download exported data from BML");
-        }
-        try {
-            String projectJson = IOUtils.toString(bmlDownloadResponse.inputStream());
-
-            IdCatalog idCatalog = doImport(projectJson, projectId, versionSuffix);
-
-            resultDataMap.put("widget", idCatalog.getWidget());
-            resultDataMap.put("dashboard", idCatalog.getDashboard());
-            resultDataMap.put("dashboardPortal", idCatalog.getDashboardPortal());
-            resultDataMap.put("view", idCatalog.getView());
-            resultDataMap.put("display", idCatalog.getDisplay());
-        } finally {
-            IOUtils.closeQuietly(bmlDownloadResponse.inputStream());
-        }
-        return resultMap.success().payload(resultDataMap);
-    }
-
-
-    @Override
-    public ResultMap copyProject(Map<String, String> params, String userName) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        log.info("begin to copy in visualis params is {}", params);
-
-        Map<String, Set<Long>> moduleIdsMap = Utils.getModuleIdsMap(params);
-
-        String projectVersion = params.getOrDefault(StringConstant.PROJECT_VERSION, StringConstant.PROJECT_VERSION_DEFAULT_VALUE);
-        String flowVersion = params.get(StringConstant.WORKFLOW_VERSION);
-        if (StringUtils.isEmpty(flowVersion)) {
-            log.error("flowVersion is null, can not copy flow to a newest version");
-            flowVersion = StringConstant.WORKFLOW_VERSION_DEFAULT_VALUE;
-        }
-        String contextIdStr = params.get(StringConstant.CONTEXT_ID);
-        if (StringUtils.isEmpty(contextIdStr)) {
-            throw new ErrorException(20012, "contextId is null, visualis can not do copy");
-        }
-
-        Long projectId = getProjectId(moduleIdsMap);
-
-        ExportedProject exportedProject = doExport(projectId, moduleIdsMap, true);
-
-        doCopy(contextIdStr, moduleIdsMap, exportedProject);
-
-        String projectJson = LinkisUtils.gson().toJson(exportedProject);
-        String versionSuffix = projectVersion + "_" + flowVersion;
-
-        IdCatalog idCatalog = doImport(projectJson, projectId, versionSuffix);
-
-        resultDataMap.put("widget", idCatalog.getWidget());
-        resultDataMap.put("dashboard", idCatalog.getDashboard());
-        resultDataMap.put("dashboardPortal", idCatalog.getDashboardPortal());
-        resultDataMap.put("display", idCatalog.getDisplay());
-        resultDataMap.put("view", idCatalog.getView());
-
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    @Override
-    public ResultMap readProject(String fileName, Long projectId, String userName) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        String versionSuffix = "";
-        String projectJson = FileUtils.readFileToString(new File(CommonConfig.EXPORT_PROJECT_DIR().getValue() + fileName));
-
-        IdCatalog idCatalog = doImport(projectJson, projectId, versionSuffix);
-
-        resultDataMap.put("widget", idCatalog.getWidget());
-        resultDataMap.put("dashboard", idCatalog.getDashboard());
-        resultDataMap.put("dashboardPortal", idCatalog.getDashboardPortal());
-        resultDataMap.put("view", idCatalog.getView());
-        resultDataMap.put("display", idCatalog.getDisplay());
-
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    private ExportedProject doExport(Long projectId, Map<String, Set<Long>> moduleIdsMap, boolean partial) throws Exception {
-        ExportedProject exportedProject = new ExportedProject();
-        Project project = projectMapper.getById(projectId);
-        exportedProject.setName(project.getName());
-
-        // moduleIdsMap holds the ids of the components to export
-
-        dssDisplayService.exportDisplays(projectId, moduleIdsMap, partial, exportedProject);
-
-        dssDashboradService.exportDashboardPortals(projectId, moduleIdsMap, partial, exportedProject);
-
-        dssWidgetService.exportWidgets(projectId, moduleIdsMap, partial, exportedProject);
-
-        dssViewService.exportViews(projectId, moduleIdsMap, partial, exportedProject);
-
-        return exportedProject;
-    }
-
-    public IdCatalog doImport(String projectJson, Long projectId, String versionSuffix) throws Exception {
-        ExportedProject exportedProject = LinkisUtils.gson().fromJson(projectJson, ExportedProject.class);
-
-        // idCatalog records the old-to-new id mappings of the exported nodes
-        IdCatalog idCatalog = new IdCatalog();
-
-        dssSourceService.importSource(projectId, versionSuffix, exportedProject, idCatalog);
-
-        dssViewService.importViews(projectId, versionSuffix, exportedProject, idCatalog);
-
-        dssWidgetService.importWidget(projectId, versionSuffix, exportedProject, idCatalog);
-
-        dssDisplayService.importDisplay(projectId, versionSuffix, exportedProject, idCatalog);
-
-        dssDashboradService.importDashboard(projectId, versionSuffix, exportedProject, idCatalog);
-
-        return idCatalog;
-    }
-
-    public void doCopy(String contextIdStr, Map<String, Set<Long>> moduleIdsMap, ExportedProject exportedProject) throws Exception {
-
-        dssWidgetService.copyWidget(contextIdStr, moduleIdsMap, exportedProject);
-
-        dssDisplayService.copyDisplay(moduleIdsMap, exportedProject);
-
-        dssDashboradService.copyDashboardPortal(moduleIdsMap, exportedProject);
-
-        dssViewService.copyView(moduleIdsMap, exportedProject);
-    }
-
-    public Long getProjectId(Map<String, Set<Long>> moduleIdsMap) {
-        Set<Long> widgets = moduleIdsMap.get(ModuleEnum.WIDGET_IDS.getName());
-        Set<Long> displays = moduleIdsMap.get(ModuleEnum.DISPLAY_IDS.getName());
-        Set<Long> dashboards = moduleIdsMap.get(ModuleEnum.DASHBOARD_PORTAL_IDS.getName());
-        Set<Long> views = moduleIdsMap.get(ModuleEnum.VIEW_IDS.getName());
-        if (!widgets.isEmpty()) {
-            return projectMapper.getProjectIdByWidgetId(widgets.iterator().next());
-        } else if (!displays.isEmpty()) {
-            return projectMapper.getProjectByDisplayId(displays.iterator().next());
-        } else if (!dashboards.isEmpty()) {
-            return projectMapper.getProjectIdByDashboardId(dashboards.iterator().next());
-        } else if (!views.isEmpty()) {
-            return projectMapper.getProjectIdByViewId(views.iterator().next());
-        } else {
-            log.error("widgets displays dashboards are all empty");
-            return -1L;
-        }
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/SourceServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/SourceServiceImpl.java
deleted file mode 100644
index 02fe0eedb..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/SourceServiceImpl.java
+++ /dev/null
@@ -1,40 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.webank.wedatasphere.dss.visualis.service.DssSourceService;
-import com.webank.wedatasphere.dss.visualis.service.Utils;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog;
-import edp.davinci.dao.SourceMapper;
-import edp.davinci.model.Source;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-
-import java.util.List;
-
-@Slf4j
-@Service("dssSourceService")
-public class SourceServiceImpl implements DssSourceService {
-
-    @Autowired
-    SourceMapper sourceMapper;
-
-    @Override
-    public void importSource(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception {
-        List<Source> sources = exportedProject.getSources();
-        if (sources == null) {
-            return;
-        }
-        for (Source source : sources) {
-            Long oldId = source.getId();
-            source.setProjectId(projectId);
-            Long existingId = sourceMapper.getByNameWithProjectId(source.getName(), projectId);
-            if (existingId != null) {
-                idCatalog.getSource().put(oldId, existingId);
-            } else {
-                sourceMapper.insert(source);
-                idCatalog.getSource().put(oldId, source.getId());
-            }
-        }
-    }
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ViewServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ViewServiceImpl.java
deleted file mode 100644
index bd63ca711..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/ViewServiceImpl.java
+++ /dev/null
@@ -1,315 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.alibaba.fastjson.JSON;
-import com.google.common.collect.Lists;
-import com.webank.wedatasphere.dss.visualis.service.DssViewService;
-import com.webank.wedatasphere.dss.visualis.entrance.spark.SqlCodeParse;
-import com.webank.wedatasphere.dss.visualis.enums.ModuleEnum;
-import com.webank.wedatasphere.dss.visualis.exception.VGErrorException;
-import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo;
-import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog;
-import com.webank.wedatasphere.dss.visualis.res.ResultHelper;
-import com.webank.wedatasphere.dss.visualis.utils.HttpUtils;
-import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils;
-import edp.core.exception.NotFoundException;
-import edp.core.model.PaginateWithQueryColumns;
-import edp.davinci.core.common.ResultMap;
-import edp.davinci.dao.ProjectMapper;
-import edp.davinci.dao.SourceMapper;
-import edp.davinci.dao.UserMapper;
-import edp.davinci.dao.ViewMapper;
-import edp.davinci.dto.viewDto.ViewExecuteSql;
-import edp.davinci.model.*;
-import edp.davinci.service.SourceService;
-import edp.davinci.service.ViewService;
-import lombok.extern.slf4j.Slf4j;
-import org.apache.commons.lang.StringUtils;
-import org.apache.linkis.server.BDPJettyServerHelper;
-import org.apache.linkis.server.security.SecurityFilter;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-import org.springframework.transaction.annotation.Transactional;
-
-import javax.servlet.http.HttpServletRequest;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import java.util.stream.Collectors;
-
-import static com.webank.wedatasphere.dss.visualis.service.Utils.updateName;
-
-@Slf4j
-@Service("dssViewService")
-public class ViewServiceImpl implements DssViewService {
-
-    @Autowired
-    ViewMapper viewMapper;
-
-    @Autowired
-    SourceMapper sourceMapper;
-
-    @Autowired
-    SourceService sourceService;
-
-    @Autowired
-    UserMapper userMapper;
-
-    @Autowired
-    ProjectMapper projectMapper;
-
-    @Autowired
-    ViewService viewService;
-
-
-    @Override
-    public List<String> getAvailableEngineTypes(HttpServletRequest req, Long id) {
-        String userName = SecurityFilter.getLoginUsername(req);
-        List<String> engineTypes;
-        if (id <= 0) {
-            engineTypes = Lists.newArrayList(VisualisUtils.SPARK().getValue());
-        } else {
-            engineTypes = sourceService.getAvailableEngineTypes(userName);
-        }
-        return engineTypes;
-    }
-
-    @Override
-    public ResultMap createView(HttpServletRequest req, DWCResultInfo dwcResultInfo) throws Exception {
-
-        Map<String, Object> resultDataMap = new HashMap<>();
-        ResultMap resultMap = new ResultMap();
-
-        try {
-            String userName = SecurityFilter.getLoginUsername(req);
-            User user = userMapper.selectByUsername(userName);
-            Project project = projectMapper.getProejctsByUser(user.getId()).get(0);
-
-            if (project == null) {
-                throw new Exception("用户没有默认的项目,请联系管理员");
-            }
-            if (dwcResultInfo == null) {
-                throw new Exception("结果为空,无法做可视化分析");
-            }
-            if (StringUtils.isEmpty(dwcResultInfo.getExecutionCode())) {
-                throw new Exception("脚本为空,无法做可视化分析");
-            }
-            String[] sqlList = SqlCodeParse.parse(dwcResultInfo.getExecutionCode());
-            int index = dwcResultInfo.getResultNumber();
-            String code = "";
-            if (index < sqlList.length) {
-                code = sqlList[index];
-            }
-
-            View view = new View();
-            view.setProjectId(project.getId());
-            view.setName(VisualisUtils.createTmpViewName(user.getName()));
-            List<Source> sources = sourceService.getSources(project.getId(), user, HttpUtils.getUserTicketId(req));
-            for (Source source : sources) {
-                if (VisualisUtils.isHiveDataSource(source)) {
-                    view.setSourceId(source.getId());
-                }
-            }
-            view.setSql(code);
-            view.setModel(ResultHelper.toModelItem(dwcResultInfo.getResultPath()));
-            view.setConfig("{\"" + VisualisUtils.DWC_RESULT_INFO().getValue() + "\":" + BDPJettyServerHelper.gson().toJson(dwcResultInfo) + "}");
-            try {
-                view = createView(view);
-                resultDataMap.put("id", view.getId());
-                resultDataMap.put("projectId", view.getProjectId());
-            } catch (VGErrorException e) {
-                log.error("可视化分析失败:", e);
-                throw new Exception("脚本为空,无法做可视化分析", e.getCause());
-            }
-            return resultMap.success().payload(resultDataMap);
-        } catch (Exception e) {
-            log.error("可视化分析失败:", e);
-            throw e;
-        }
-    }
-
-    @Override
-    public ResultMap getViewData(HttpServletRequest req, Long id) throws Exception {
-
-        ResultMap resultMap = new ResultMap();
-        Map<String, Object> resultDataMap = new HashMap<>();
-
-        if (id == null) {
-            throw new Exception("viewId is null when dss execute view node");
-        }
-        String userName = SecurityFilter.getLoginUsername(req);
-        User user = userMapper.selectByUsername(userName);
-        if (user == null) {
-            throw new Exception("user is empty when dss execute view node");
-        }
-        View view = viewMapper.getById(id);
-        if (view == null) {
-            throw new Exception("viewInfo is empty when dss execute view node");
-        }
-        ViewExecuteSql executeSql = new ViewExecuteSql();
-        Long sourceId = view.getSourceId();
-        if (sourceId == null) {
-            throw new Exception("sourceId is null when dss execute view node");
-        }
-        executeSql.setSourceId(sourceId);
-        String sql = view.getSql();
-        executeSql.setSql(sql);
-        String variableStr = view.getVariable();
-        if (StringUtils.isNotEmpty(variableStr)) {
-            List<SqlVariable> variables = JSON.parseArray(variableStr, SqlVariable.class);
-            log.info("variables:{}", executeSql);
-            executeSql.setVariables(variables);
-        }
-        // execute the view node
-        PaginateWithQueryColumns paginateWithQueryColumns = viewService.executeSql(executeSql, user);
-        if (paginateWithQueryColumns != null) {
-            resultDataMap.put("columns", paginateWithQueryColumns.getColumns());
-            resultDataMap.put("resultList", paginateWithQueryColumns.getResultList());
-        } else {
-            return resultMap.fail().payload("View执行失败.");
-        }
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    @Override
-    public ResultMap submitQuery(HttpServletRequest req, Long id) throws Exception {
-
-        ResultMap resultMap = new ResultMap();
-        Map<String, Object> resultDataMap = new HashMap<>();
-
-        if (id == null) {
-            throw new Exception("viewId is null when dss execute view node");
-        }
-        String userName = SecurityFilter.getLoginUsername(req);
-        User user = userMapper.selectByUsername(userName);
-        if (user == null) {
-            throw new Exception("user is empty when dss execute view node");
-        }
-        View view = viewMapper.getById(id);
-        if (view == null) {
-            throw new Exception("viewInfo is empty when dss execute view node");
-        }
-        ViewExecuteSql executeSql = new ViewExecuteSql();
-        Long sourceId = view.getSourceId();
-        if (sourceId == null) {
-            throw new Exception("sourceId is null when dss execute view node");
-        }
-        executeSql.setSourceId(sourceId);
-        String sql = view.getSql();
-        executeSql.setSql(sql);
-        String variableStr = view.getVariable();
-        if (StringUtils.isNotEmpty(variableStr)) {
-            List<SqlVariable> variables = JSON.parseArray(variableStr, SqlVariable.class);
-            log.info("variables:{}", executeSql);
-            executeSql.setVariables(variables);
-        }
-        // execute the view SQL asynchronously
-        PaginateWithExecStatus paginateWithExecStatus = viewService.AsyncSubmitSql(executeSql, user);
-        if (paginateWithExecStatus != null) {
-            resultDataMap.put("paginateWithExecStatus", paginateWithExecStatus);
-        } else {
-            resultMap.fail().payload("view执行失败.");
-        }
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    @Override
-    public ResultMap isHiveDataSource(HttpServletRequest req, Long id) throws Exception {
-
-        ResultMap resultMap = new ResultMap();
-        Map<String, Object> resultDataMap = new HashMap<>();
-
-        if (id == null) {
-            throw new Exception("viewId is null when dss execute view node");
-        }
-        String userName = SecurityFilter.getLoginUsername(req);
-        User user = userMapper.selectByUsername(userName);
-        if (user == null) {
-            throw new Exception("user is empty when dss execute view node");
-        }
-        View view = viewMapper.getById(id);
-        if (view == null) {
-            throw new Exception("viewInfo is empty when dss execute view node");
-        }
-        Long sourceId = view.getSourceId();
-        if (sourceId == null) {
-            throw new Exception("sourceId is null when dss execute view node");
-        }
-        Source source = sourceMapper.getById(sourceId);
-        if (null == source) {
-            throw new NotFoundException("source is not found");
-        }
-        if (VisualisUtils.isLinkisDataSource(source)) {
-            resultDataMap.put("isLinkisDataSource", true);
-        } else {
-            resultDataMap.put("isLinkisDataSource", false);
-        }
-        return resultMap.success().payload(resultDataMap);
-    }
-
-    @Override
-    public void exportViews(Long projectId, Map<String, Set<Long>> moduleIdsMap, boolean partial, ExportedProject exportedProject) throws Exception {
-        if (partial) {
-            Set<Long> longs = moduleIdsMap.get(ModuleEnum.VIEW_IDS.getName());
-            if (longs.size() > 0) {
-                exportedProject.setViews(viewMapper.getByIds(longs));
-                Set<Long> sourceIds = exportedProject.getViews().stream().map(View::getSourceId).collect(Collectors.toSet());
-                List<Source> sources = sourceMapper.getByProject(projectId).stream().filter(s -> sourceIds.contains(s.getId())).collect(Collectors.toList());
-                exportedProject.setSources(sources);
-            }
-        } else {
-            exportedProject.setSources(sourceMapper.getByProject(projectId));
-            List<View> exportedViews = Lists.newArrayList();
-            for (Source source : exportedProject.getSources()) {
-                exportedViews.addAll(viewMapper.getBySourceId(source.getId()));
-            }
-            exportedProject.setViews(exportedViews);
-        }
-        log.info("exporting project, export views: {}", exportedProject);
-    }
-
-    @Override
-    public void importViews(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) throws Exception {
-        List<View> views = exportedProject.getViews();
-        if (views == null) {
-            return;
-        }
-        for (View view : views) {
-            Long oldId = view.getId();
-            view.setProjectId(projectId);
-            view.setName(updateName(view.getName(), versionSuffix));
-            if (idCatalog.getSource().get(view.getSourceId()) != null) {
-                view.setSourceId(idCatalog.getSource().get(view.getSourceId()));
-            }
-            Long existingId = viewMapper.getByNameWithProjectId(view.getName(), projectId);
-            if (existingId != null) {
-                idCatalog.getView().put(oldId, existingId);
-            } else {
-                viewMapper.insert(view);
-                idCatalog.getView().put(oldId, view.getId());
-            }
-        }
-    }
-
-    @Override
-    public void copyView(Map<String, Set<Long>> moduleIdsMap, ExportedProject exportedProject) throws Exception {
-        Set<Long> viewIds = moduleIdsMap.get(ModuleEnum.VIEW_IDS.getName());
-        if (!viewIds.isEmpty()) {
-            View view = exportedProject.getViews().get(0);
-            exportedProject.setViews(Lists.newArrayList(view));
-        }
-    }
-
-
-    @Transactional
-    private View createView(View view) throws VGErrorException {
-        int id = viewMapper.insert(view);
-        if (id < 0) {
-            throw new VGErrorException(70002, "将view 插入数据库失败");
-        }
-        return view;
-    }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/WidgetServiceImpl.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/WidgetServiceImpl.java
deleted file mode 100644
index 4e87d8072..000000000
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/service/impl/WidgetServiceImpl.java
+++ /dev/null
@@ -1,412 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.service.impl;
-
-import com.alibaba.fastjson.JSONObject;
-import com.google.common.collect.Iterables;
-import com.google.common.collect.Lists;
-import com.google.common.collect.Maps;
-import com.google.common.collect.Sets;
-import com.google.gson.JsonElement;
-import com.google.gson.JsonObject;
-import com.webank.wedatasphere.dss.visualis.service.Utils;
-import com.webank.wedatasphere.dss.visualis.service.DssWidgetService;
-import com.webank.wedatasphere.dss.visualis.content.WidgetContant;
-import com.webank.wedatasphere.dss.visualis.exception.VGErrorException;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.ExportedProject;
-import com.webank.wedatasphere.dss.visualis.model.optmodel.IdCatalog;
-import com.webank.wedatasphere.dss.visualis.query.model.VirtualView;
-import com.webank.wedatasphere.dss.visualis.query.service.VirtualViewQueryServiceImpl;
-import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils;
-import com.webank.wedatasphere.dss.visualis.utils.StringConstant;
-import edp.core.exception.ServerException;
-import edp.core.model.PaginateWithQueryColumns;
-import edp.core.utils.CollectionUtils;
-import edp.davinci.common.utils.ScriptUtils;
-import edp.davinci.core.common.ResultMap;
-import edp.davinci.dao.*;
-import edp.davinci.dto.dashboardDto.DashboardWithPortal;
-import edp.davinci.dto.displayDto.DisplayWithProject;
-import edp.davinci.dto.viewDto.ViewExecuteParam;
-import edp.davinci.dto.widgetDto.WidgetCreate;
-import edp.davinci.model.*;
-import edp.davinci.service.ViewService;
-import edp.davinci.service.WidgetService;
-import org.apache.linkis.adapt.LinkisUtils;
-import lombok.extern.slf4j.Slf4j;
-import org.apache.commons.lang.StringUtils;
-import org.apache.linkis.common.exception.ErrorException;
-import org.apache.linkis.cs.common.utils.CSCommonUtils;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;
-
-import java.text.SimpleDateFormat;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import java.util.stream.Collectors;
-
-import static edp.davinci.common.utils.ScriptUtils.getExecuptParamScriptEngine;
-
-@Slf4j
-@Service("dssWidgetService")
-public class WidgetServiceImpl implements DssWidgetService {
-
private SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); - - private String migratedOldTime = "2000-05-26 18:34:01"; - - @Autowired - WidgetMapper widgetMapper; - - @Autowired - SourceMapper sourceMapper; - - @Autowired - ViewMapper viewMapper; - - @Autowired - UserMapper userMapper; - - @Autowired - DashboardMapper dashboardMapper; - - @Autowired - DisplayMapper displayMapper; - - @Autowired - WidgetService widgetService; - - @Autowired - ViewService viewService; - - @Autowired - VirtualViewQueryServiceImpl virtualViewQueryService; - - @Override - public ResultMap rename(Map params) throws Exception { - - ResultMap resultMap = new ResultMap(); - - Long widgetId = ((Integer) params.getOrDefault("id", -1)).longValue(); - String widgetName = ((String) params.getOrDefault("name", "")); - - Widget widget = widgetMapper.getById(widgetId); - widget.setName(widgetName); - Map configMap = LinkisUtils.gson().fromJson(widget.getConfig(), Map.class); - configMap.put("nodeName", widgetName); - if (configMap.get("view") != null && (configMap.get("view") instanceof Map)) { - Map viewMap = (Map) configMap.get("view"); - Map sourceMap = (Map) viewMap.get("source"); - Map dataSourceContentMap = (Map) sourceMap.get("dataSourceContent"); - dataSourceContentMap.put("nodeName", widgetName); - sourceMap.put("dataSourceContent", dataSourceContentMap); - viewMap.put("source", sourceMap); - configMap.put("view", viewMap); - } - widget.setConfig(LinkisUtils.gson().toJson(configMap)); - widgetMapper.update(widget); - - return resultMap.success(); - } - - @Override - public ResultMap smartCreateFromSql(String userName, Map params) throws Exception { - - ResultMap resultMap = new ResultMap(); - Map resultDataMap = new HashMap<>(); - - User user = userMapper.selectByUsername(userName); - - String widgetName = ((String) params.getOrDefault("widgetName", "")); - String viewName = ((String) params.getOrDefault("viewName", "")); - String viewSql = ((String) 
params.getOrDefault("viewSql", "")); - Long viewId = ((Integer) params.getOrDefault("viewId", -1)).longValue(); - Long projectId = ((Integer) params.getOrDefault("projectId", -1)).longValue(); - String nodeName = ((String) params.getOrDefault(CSCommonUtils.NODE_NAME_STR, "")); - String contextId = ((String) params.getOrDefault(CSCommonUtils.CONTEXT_ID_STR, "")); - String encodedContextId = QueryUtils.encodeContextId(contextId); - String description = (String) params.getOrDefault("description", ""); - - WidgetCreate widgetCreate = new WidgetCreate(); - widgetCreate.setName(widgetName); - widgetCreate.setProjectId(projectId); - widgetCreate.setPublish(true); - widgetCreate.setType(1L); - widgetCreate.setDescription(description); - String widgetConfig = ""; - - if (viewId < 0) { - if (StringUtils.isBlank(nodeName)) { - String contextInfo = "\"contextId\":\"" + "\", \"nodeName\":\"" + widgetName + "\","; - widgetConfig = StringUtils.replace(WidgetContant.WIDGET_CHART_CONFIG_TEMPLE, "${model_content}", "\"\""); - widgetConfig = StringUtils.replace(widgetConfig, "${context_info}", contextInfo); - } else { - VirtualView virtualView = QueryUtils.getExactFromContext(encodedContextId, nodeName); - if (virtualView == null) { - throw new ServerException("节点[" + nodeName + "]: 没有产生查询结果集,或非法绑定View节点"); - } - String contextInfo = "\"contextId\":\"" + "\", \"nodeName\":\"" + widgetName + "\",\"refNodeName\":\"" + nodeName + "\","; - widgetConfig = StringUtils.replace(WidgetContant.WIDGET_CHART_CONFIG_TEMPLE, "${model_content}", "\"\""); - widgetConfig = StringUtils.replace(widgetConfig, "${context_info}", contextInfo); - } - } else { - View view = viewMapper.getById(viewId); - widgetCreate.setViewId(viewId); - widgetConfig = StringUtils.replace(WidgetContant.WIDGET_CHART_CONFIG_TEMPLE, "${model_content}", view.getModel()); - widgetConfig = StringUtils.replace(widgetConfig, "${context_info}", ""); - } - widgetCreate.setConfig(widgetConfig); - Widget newWidget = 
widgetService.createWidget(widgetCreate, user); - - resultDataMap.put("widgetId", newWidget.getId()); - resultDataMap.put("widgetName", newWidget.getName()); - resultDataMap.put("viewId", viewId); - - return resultMap.success().payload(resultDataMap); - } - - @Override - public ResultMap updateContextId(Long widgetId, String contextId) throws Exception { - Widget widget = widgetMapper.getById(widgetId); - if (widget == null) { - throw new ServerException("Widget does not exist"); - } - Map configMap = LinkisUtils.gson().fromJson(widget.getConfig(), Map.class); - if (configMap.get("contextId") == null) { - throw new ServerException("This Widget does not have contextId"); - } - String encodedContextId = QueryUtils.encodeContextId(contextId); - String nodeName = (String) configMap.get("refNodeName"); - if (StringUtils.isNotBlank(nodeName)) { - try { - VirtualView virtualView = Iterables.getFirst(QueryUtils.getFromContext(encodedContextId, nodeName), null); - if (virtualView != null) { - configMap.put("view", virtualView); - } - } catch (ErrorException e) { - log.error("Get visualView error by ContextID: {} and nodeName: {}", contextId, nodeName); - throw new VGErrorException(20003, "get visualView error, due to error error contextId and nodeName."); - } - } - - Object viewObj = configMap.get("view"); - if (viewObj != null && (viewObj instanceof Map)) { - Map viewMap = (Map) viewObj; - if (viewMap.size() > 0) { - Map sourceMap = (Map) viewMap.get("source"); - Map dataSourceContentMap = (Map) sourceMap.get("dataSourceContent"); - dataSourceContentMap.put("contextId", contextId); - sourceMap.put("dataSourceContent", dataSourceContentMap); - viewMap.put("source", sourceMap); - configMap.put("view", viewMap); - } - } - log.info("widget:{}", widget); - configMap.put("contextId", QueryUtils.encodeContextId(contextId)); - widget.setConfig(LinkisUtils.gson().toJson(configMap)); - widgetMapper.update(widget); - - return new ResultMap().success(); - } - - @Override - public 
ResultMap getWidgetData(String userName, Long widgetId) throws Exception { - - ResultMap resultMap = new ResultMap(); - Map resultDataMap = new HashMap<>(); - - User user = userMapper.selectByUsername(userName); - Widget widget = widgetMapper.getById(widgetId); - - JSONObject configObject = JSONObject.parseObject(widget.getConfig()); - if (configObject.get("query") == null) { - log.warn("querying an empty widget"); - resultDataMap.put("columns", Lists.newArrayList()); - resultDataMap.put("resultList", Lists.newArrayList()); - return resultMap.success().payload(resultDataMap); - } - ViewExecuteParam viewExecuteParam = ScriptUtils.getViewExecuteParam(getExecuptParamScriptEngine(), null, widget.getConfig(), null); - PaginateWithQueryColumns paginate; - if (viewExecuteParam.getView() == null) { - paginate = (PaginateWithQueryColumns) viewService.getData(widget.getViewId(), viewExecuteParam, user, false); - } else { - //for production published - String encodedContextId = configObject.getString("contextId"); - if (StringUtils.isNotBlank(encodedContextId)) { - String newContextId = QueryUtils.decodeContextId(encodedContextId); - String oldContextId = viewExecuteParam.getView().getSource().getDataSourceContent().get("contextId"); - if (!newContextId.equals(oldContextId)) { - viewExecuteParam.getView().getSource().getDataSourceContent().put("contextId", newContextId); - configObject.put("view", JSONObject.toJSON(viewExecuteParam.getView())); - configObject.getJSONObject("view").put("model", - JSONObject.parse(viewExecuteParam.getView().getModel())); - widget.setConfig(JSONObject.toJSONString(configObject)); - widgetMapper.update(widget); - } - } - paginate = (PaginateWithQueryColumns) virtualViewQueryService.getData(viewExecuteParam, user, false); - } - resultDataMap.put("columns", paginate.getColumns()); - resultDataMap.put("resultList", paginate.getResultList()); - return resultMap.success().payload(resultDataMap); - } - - @Override - public ResultMap 
compareWithSnapshot(String userName, String type, Long id) throws Exception { - - ResultMap resultMap = new ResultMap(); - Map resultDataMap = new HashMap<>(); - - User user = userMapper.selectByUsername(userName); - - Project project = null; - Set widgets = Sets.newHashSet(); - if ("dashboard".equals(type)) { - DashboardWithPortal dashboardWithPortal = dashboardMapper.getDashboardWithPortalAndProject(id); - if (dashboardWithPortal == null) { - return resultMap.fail(); - } - project = dashboardWithPortal.getProject(); - widgets.addAll(widgetMapper.getByDashboard(id)); - } else if ("portal".equals(type)) { - List dashboards = dashboardMapper.getByPortalId(id); - if (CollectionUtils.isEmpty(dashboards)) { - return resultMap.fail(); - } - for (Dashboard dashboard : dashboards) { - DashboardWithPortal dashboardWithPortal = dashboardMapper.getDashboardWithPortalAndProject(dashboard.getId()); - project = dashboardWithPortal.getProject(); - widgets.addAll(widgetMapper.getByDashboard(dashboard.getId())); - } - } else { - DisplayWithProject displayWithProject = displayMapper.getDisplayWithProjectById(id); - if (displayWithProject == null) { - return resultMap.fail(); - } - project = displayWithProject.getProject(); - widgets.addAll(widgetMapper.getByDisplayId(id)); - } - List> widgetsMetaData = Lists.newArrayList(); - for (Widget widget : widgets) { - Map widgetMeta = Maps.newHashMap(); - widgetMeta.put("name", widget.getName()); - if (widget.getUpdateTime() == null) { - widgetMeta.put("updated", migratedOldTime); - } else { - widgetMeta.put("updated", simpleDateFormat.format(widget.getUpdateTime())); - } - widgetMeta.put("columns", StringUtils.join(getWidgetUsedColumns(widget.getConfig()), ";")); - widgetsMetaData.add(widgetMeta); - } - resultDataMap.put("projectName", project.getName()); - resultDataMap.put("widgetsMetaData", widgetsMetaData); - return resultMap.success().payload(resultDataMap); - } - - private Set getWidgetUsedColumns(String config) { - Set columns = 
Sets.newHashSet(); - JsonObject configJson = LinkisUtils.gson().fromJson(config, JsonElement.class).getAsJsonObject(); - configJson.getAsJsonArray("rows").forEach(e -> columns.add(getRealColumn(e.getAsJsonObject().get("name").getAsString()))); - configJson.getAsJsonArray("cols").forEach(e -> columns.add(getRealColumn(e.getAsJsonObject().get("name").getAsString()))); - configJson.getAsJsonArray("metrics").forEach(e -> columns.add(getRealColumn(e.getAsJsonObject().get("name").getAsString()))); - return columns; - } - - private String getRealColumn(String wrappedColumn) { - return wrappedColumn.split("@")[0]; - } - - - @Override - public void exportWidgets(Long projectId, Map> moduleIdsMap, boolean partial, ExportedProject exportedProject) { - if (partial) { - Set longs = moduleIdsMap.get(StringConstant.WIDGET_IDS); - if (longs.size() > 0) { - exportedProject.setWidgets(widgetMapper.getByIds(longs)); - exportedProject.setViews(Lists.newArrayList(viewMapper.selectByWidgetIds(longs))); - Set sourceIds = exportedProject.getViews().stream().map(View::getSourceId).collect(Collectors.toSet()); - List sources = sourceMapper.getByProject(projectId).stream().filter(s -> sourceIds.contains(s.getId())).collect(Collectors.toList()); - exportedProject.setSources(sources); - } - - } else { - exportedProject.setWidgets(widgetMapper.getByProject(projectId)); - exportedProject.setSources(sourceMapper.getByProject(projectId)); - List exportedViews = Lists.newArrayList(); - for (Source source : exportedProject.getSources()) { - exportedViews.addAll(viewMapper.getBySourceId(source.getId())); - } - exportedProject.setViews(exportedViews); - } - log.info("exporting project, export widgets: {}", exportedProject); - } - - @Override - public void importWidget(Long projectId, String versionSuffix, ExportedProject exportedProject, IdCatalog idCatalog) { - List widgets = exportedProject.getWidgets(); - if (widgets == null) { - return; - } - for (Widget widget : widgets) { - Long oldId = 
widget.getId(); - widget.setProjectId(projectId); - widget.setName(Utils.updateName(widget.getName(), versionSuffix)); - widget.setViewId(idCatalog.getView().get(widget.getViewId())); - Long existingId = widgetMapper.getByNameWithProjectId(widget.getName(), projectId); - if (existingId != null) { - idCatalog.getWidget().put(oldId, existingId); - } else { - widgetMapper.insert(widget); - idCatalog.getWidget().put(oldId, widget.getId()); - } - } - } - - @SuppressWarnings("unchecked") - @Override - public void copyWidget(String contextIdStr, Map> moduleIdsMap, ExportedProject exportedProject) throws Exception { - Set widgetIds = moduleIdsMap.get(StringConstant.WIDGET_IDS); - //将widget新的contextId和名字进行替换 - if (!widgetIds.isEmpty()) { - List widgetLists = Lists.newArrayList(); - for (Widget widgetItem : exportedProject.getWidgets()) { - if (widgetItem != null) { - Widget newWidget = widgetItem; - // 获取config内容,转换成map - Map configMap = LinkisUtils.gson().fromJson(newWidget.getConfig(), Map.class); - // 获取上下文id - String encodedContextId = QueryUtils.encodeContextId(contextIdStr); - String nodeName = (String) configMap.get(StringConstant.NODE_NAME); - // 从map中获取VirtualView - if (StringUtils.isNotBlank(nodeName)) { - VirtualView virtualView = Iterables.getFirst(QueryUtils.getFromContext(encodedContextId, nodeName), null); - if (virtualView != null) { - configMap.put(StringConstant.VIEW, virtualView); - } - } - - if (configMap.get(StringConstant.VIEW) != null && !(configMap.get(StringConstant.VIEW) instanceof VirtualView)) { - Object viewVal = configMap.get(StringConstant.VIEW); - // 判断拿到的结构是否是map结构,可能存在不是map的情况 - if (viewVal != null && viewVal.toString().matches("^([-+])?\\d+(\\.\\d+)?$")) { - widgetLists.add(newWidget); - } - } else { - // 拿到viewMap - Map viewMap = (Map) configMap.get(StringConstant.VIEW); - Map sourceMap = (Map) viewMap.get(StringConstant.SOURCE); - Map dataSourceContentMap = (Map) sourceMap.get(StringConstant.DATASOURCE_CONTENT); - 
dataSourceContentMap.put(StringConstant.CONTEXT_ID, contextIdStr); - sourceMap.put(StringConstant.DATASOURCE_CONTENT, dataSourceContentMap); - viewMap.put(StringConstant.SOURCE, sourceMap); - configMap.put(StringConstant.VIEW, viewMap); - } - configMap.put(StringConstant.CONTEXT_ID, QueryUtils.encodeContextId(contextIdStr)); - newWidget.setConfig(LinkisUtils.gson().toJson(configMap)); - widgetLists.add(newWidget); - } - exportedProject.setWidgets(widgetLists); - } - } - } -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpClientUtil.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpClientUtil.java index 266325196..f8d9eef3f 100644 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpClientUtil.java +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpClientUtil.java @@ -1,3 +1,19 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ package com.webank.wedatasphere.dss.visualis.utils; import org.apache.http.Consts; @@ -31,41 +47,43 @@ import java.util.*; import java.util.Map.Entry; +/** + * Created by shanhuang on 2019/1/23. 
+ */ @SuppressWarnings("all") public final class HttpClientUtil { - private final static Logger logger = LoggerFactory.getLogger(HttpClientUtil.class); - public final static int connectTimeout = 5000; - private static PoolingHttpClientConnectionManager connManager = null; - private static CloseableHttpClient httpclient = null; - - /** - * 重写验证方法,取消检测ssl - */ - private static TrustManager trustAllManager = new X509TrustManager() { - @Override - public void checkClientTrusted(java.security.cert.X509Certificate[] arg0, String arg1) - throws CertificateException { - } - - @Override - public void checkServerTrusted(java.security.cert.X509Certificate[] arg0, String arg1) - throws CertificateException { - } - - @Override - public java.security.cert.X509Certificate[] getAcceptedIssuers() { - return null; - } - - }; - - static { - httpclient = HttpClients.createDefault(); - } - - + private final static Logger logger = LoggerFactory.getLogger(HttpClientUtil.class); + public final static int connectTimeout = 5000; + private static PoolingHttpClientConnectionManager connManager = null; + private static CloseableHttpClient httpclient = null; + + /** + * 重写验证方法,取消检测ssl + */ + private static TrustManager trustAllManager = new X509TrustManager() { + @Override + public void checkClientTrusted(java.security.cert.X509Certificate[] arg0, String arg1) + throws CertificateException { + } + @Override + public void checkServerTrusted(java.security.cert.X509Certificate[] arg0, String arg1) + throws CertificateException { + } + @Override + public java.security.cert.X509Certificate[] getAcceptedIssuers() { + return null; + } + + }; + + static { + httpclient = HttpClients.createDefault(); + } + + /** + * * @param url * @param timeout * @param headerMap @@ -73,14 +91,14 @@ public java.security.cert.X509Certificate[] getAcceptedIssuers() { * @param encoding * @return */ - public static String postForm(String url, int timeout, Map headerMap, List paramsList, String encoding) { + public static 
String postForm(String url, int timeout, Map headerMap, List paramsList, String encoding){ HttpPost post = new HttpPost(url); try { - if (headerMap != null) { - for (Entry entry : headerMap.entrySet()) { - post.setHeader(entry.getKey(), entry.getValue().toString()); + if(headerMap != null){ + for(Entry entry : headerMap.entrySet()){ + post.setHeader(entry.getKey(), entry.getValue().toString()); } - } + } //post.setHeader("Content-type", "application/json"); RequestConfig requestConfig = RequestConfig.custom() .setSocketTimeout(timeout) @@ -94,365 +112,366 @@ public static String postForm(String url, int timeout, Map heade try { HttpEntity entity = response.getEntity(); try { - if (entity != null) { - String str = EntityUtils.toString(entity, encoding); - return str; - } - } finally { - if (entity != null) { - entity.getContent().close(); - } - } - } finally { - if (response != null) { - response.close(); - } - } - } catch (Exception e) { - throw new RuntimeException("invoke http post error!", e); - } finally { - post.releaseConnection(); - } - return ""; - } - - /** - * 调用saltapi时 - * - * @author: XIEJIAN948@pingan.com.cn - */ - public static String postJsonBody(String url, int timeout, Map headerMap, - String paraData, String encoding) { - - logger.info("successfully start post Json Body url{} ", url); - HttpPost post = new HttpPost(url); - try { - if (headerMap != null) { - for (Entry entry : headerMap.entrySet()) { - post.setHeader(entry.getKey(), entry.getValue().toString()); - } - } - RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) - .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); - StringEntity jsonEntity = new StringEntity(paraData, ContentType.APPLICATION_JSON); - post.setConfig(requestConfig); - post.setEntity(jsonEntity); - CloseableHttpResponse response = httpclient.execute(post); - try { - HttpEntity entity = response.getEntity(); - try { - if (entity != null) { + 
if(entity != null){ String str = EntityUtils.toString(entity, encoding); return str; } } finally { - if (entity != null) { + if(entity != null){ entity.getContent().close(); } } } finally { - if (response != null) { + if(response != null){ response.close(); } } - } catch (UnsupportedEncodingException e) { - logger.error("UnsupportedEncodingException", e); - throw new RuntimeException("failed post json return blank!"); } catch (Exception e) { - logger.error("Exception", e); - throw new RuntimeException("failed post json return blank!"); + throw new RuntimeException("invoke http post error!",e); } finally { post.releaseConnection(); } - logger.info("successfully end post Json Body url{} ", url); return ""; } - @SuppressWarnings("deprecation") - public static String invokeGet(String url, Map params, String encode, int connectTimeout, - int soTimeout) { - String responseString = null; - RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout) - .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build(); - - StringBuilder sb = new StringBuilder(); - sb.append(url); - int i = 0; - if (params != null) { - for (Entry entry : params.entrySet()) { - if (i == 0 && !url.contains("?")) { - sb.append("?"); - } else { - sb.append("&"); - } - sb.append(entry.getKey()); - sb.append("="); - String value = entry.getValue(); - try { - sb.append(URLEncoder.encode(value, "UTF-8")); - } catch (UnsupportedEncodingException e) { - logger.warn("encode http get params error, value is " + value, e); - sb.append(URLEncoder.encode(value)); - } - i++; - } - } - HttpGet get = new HttpGet(sb.toString()); - get.setConfig(requestConfig); - try { - CloseableHttpResponse response = httpclient.execute(get); - try { - HttpEntity entity = response.getEntity(); - try { - if (entity != null) { - responseString = EntityUtils.toString(entity, encode); - } - } finally { - if (entity != null) { - entity.getContent().close(); - } - } - } catch (Exception 
e) { - logger.error(String.format("[HttpUtils Get]get response error, url:%s", sb.toString()), e); - return responseString; - } finally { - if (response != null) { - response.close(); - } - } - // System.out.println(String.format("[HttpUtils Get]Debug url:%s , - // response string %s:", sb.toString(), responseString)); - } catch (SocketTimeoutException e) { - logger.error(String.format("[HttpUtils Get]invoke get timout error, url:%s", sb.toString()), e); - return responseString; - } catch (Exception e) { - logger.error(String.format("[HttpUtils Get]invoke get error, url:%s", sb.toString()), e); - } finally { - get.releaseConnection(); - } - return responseString; - } - - /** - * HTTPS请求,默认超时为5S - * - * @param reqURL - * @param params - * @return - */ - public static String connectPostHttps(String reqURL, Map params) { - - String responseContent = null; - HttpPost httpPost = new HttpPost(reqURL); - try { - RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout) - .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build(); - List formParams = new ArrayList(); - httpPost.setEntity(new UrlEncodedFormEntity(formParams, Consts.UTF_8)); - httpPost.setConfig(requestConfig); - // 绑定到请求 Entry - for (Entry entry : params.entrySet()) { - formParams.add(new BasicNameValuePair(entry.getKey(), entry.getValue())); - } - CloseableHttpResponse response = httpclient.execute(httpPost); - try { - // 执行POST请求 - HttpEntity entity = response.getEntity(); // 获取响应实体 - try { - if (null != entity) { - responseContent = EntityUtils.toString(entity, Consts.UTF_8); - } - } finally { - if (entity != null) { - entity.getContent().close(); - } - } - } finally { - if (response != null) { - response.close(); - } - } - logger.info("requestURI : " + httpPost.getURI() + ", responseContent: " + responseContent); - } catch (ClientProtocolException e) { - logger.error("ClientProtocolException", e); - } catch (IOException e) { - 
logger.error("IOException", e); - } finally { - httpPost.releaseConnection(); - } - return responseContent; - - } - - class Test { - String v; - String k; - - public String getV() { - return v; - } - - public void setV(String v) { - this.v = v; - } - - public String getK() { - return k; - } - - public void setK(String k) { - this.k = k; - } - - } - - // 随机4位数 - public static String getRandomValue() { - String str = "0123456789"; - StringBuilder sb = new StringBuilder(4); - for (int i = 0; i < 4; i++) { - char ch = str.charAt(new Random().nextInt(str.length())); - sb.append(ch); - } - return sb.toString(); - } - - // 当前时间到秒 - public static String getTimestamp() { - - Date date = new Date(); - String timestamp = String.valueOf(date.getTime() / 1000); - return timestamp; - } - - // 当前时间到秒 - public static String getNowDate() { - Date date = new Date(); - SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMddHHmmss"); - return sdf.format(date); - } - - /** - * 调用saltapi时 - * - * @author: XIEJIAN948@pingan.com.cn - */ - public static String postJsonBody2(String url, int timeout, Map headerMap, + /** + * 调用saltapi时 + * + * @author: XIEJIAN948@pingan.com.cn + */ + public static String postJsonBody(String url, int timeout, Map headerMap, + String paraData, String encoding) { + + logger.info("successfully start post Json Body url{} ", url); + HttpPost post = new HttpPost(url); + try { + if (headerMap != null) { + for (Entry entry : headerMap.entrySet()) { + post.setHeader(entry.getKey(), entry.getValue().toString()); + } + } + RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) + .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); + StringEntity jsonEntity = new StringEntity(paraData, ContentType.APPLICATION_JSON); + post.setConfig(requestConfig); + post.setEntity(jsonEntity); + CloseableHttpResponse response = httpclient.execute(post); + try { + HttpEntity entity = response.getEntity(); + try { 
+ if (entity != null) { + String str = EntityUtils.toString(entity, encoding); + return str; + } + } finally { + if (entity != null) { + entity.getContent().close(); + } + } + } finally { + if (response != null) { + response.close(); + } + } + } catch (UnsupportedEncodingException e) { + logger.error("UnsupportedEncodingException", e); + throw new RuntimeException("failed post json return blank!"); + } catch (Exception e) { + logger.error("Exception", e); + throw new RuntimeException("failed post json return blank!"); + } finally { + post.releaseConnection(); + } + logger.info("successfully end post Json Body url{} ", url); + return ""; + } + + @SuppressWarnings("deprecation") + public static String invokeGet(String url, Map params, String encode, int connectTimeout, + int soTimeout) { + String responseString = null; + RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout) + .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build(); + + StringBuilder sb = new StringBuilder(); + sb.append(url); + int i = 0; + if (params != null) { + for (Entry entry : params.entrySet()) { + if (i == 0 && !url.contains("?")) { + sb.append("?"); + } else { + sb.append("&"); + } + sb.append(entry.getKey()); + sb.append("="); + String value = entry.getValue(); + try { + sb.append(URLEncoder.encode(value, "UTF-8")); + } catch (UnsupportedEncodingException e) { + logger.warn("encode http get params error, value is " + value, e); + sb.append(URLEncoder.encode(value)); + } + i++; + } + } + HttpGet get = new HttpGet(sb.toString()); + get.setConfig(requestConfig); + try { + CloseableHttpResponse response = httpclient.execute(get); + try { + HttpEntity entity = response.getEntity(); + try { + if (entity != null) { + responseString = EntityUtils.toString(entity, encode); + } + } finally { + if (entity != null) { + entity.getContent().close(); + } + } + } catch (Exception e) { + logger.error(String.format("[HttpUtils Get]get response 
error, url:%s", sb.toString()), e); + return responseString; + } finally { + if (response != null) { + response.close(); + } + } + // System.out.println(String.format("[HttpUtils Get]Debug url:%s , + // response string %s:", sb.toString(), responseString)); + } catch (SocketTimeoutException e) { + logger.error(String.format("[HttpUtils Get]invoke get timout error, url:%s", sb.toString()), e); + return responseString; + } catch (Exception e) { + logger.error(String.format("[HttpUtils Get]invoke get error, url:%s", sb.toString()), e); + } finally { + get.releaseConnection(); + } + return responseString; + } + + /** + * HTTPS请求,默认超时为5S + * + * @param reqURL + * @param params + * @return + */ + public static String connectPostHttps(String reqURL, Map params) { + + String responseContent = null; + HttpPost httpPost = new HttpPost(reqURL); + try { + RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout) + .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build(); + List formParams = new ArrayList(); + httpPost.setEntity(new UrlEncodedFormEntity(formParams, Consts.UTF_8)); + httpPost.setConfig(requestConfig); + // 绑定到请求 Entry + for (Entry entry : params.entrySet()) { + formParams.add(new BasicNameValuePair(entry.getKey(), entry.getValue())); + } + CloseableHttpResponse response = httpclient.execute(httpPost); + try { + // 执行POST请求 + HttpEntity entity = response.getEntity(); // 获取响应实体 + try { + if (null != entity) { + responseContent = EntityUtils.toString(entity, Consts.UTF_8); + } + } finally { + if (entity != null) { + entity.getContent().close(); + } + } + } finally { + if (response != null) { + response.close(); + } + } + logger.info("requestURI : " + httpPost.getURI() + ", responseContent: " + responseContent); + } catch (ClientProtocolException e) { + logger.error("ClientProtocolException", e); + } catch (IOException e) { + logger.error("IOException", e); + } finally { + httpPost.releaseConnection(); + } + 
return responseContent; + + } + + // Simple key/value holder + class Test { + String v; + String k; + + public String getV() { + return v; + } + + public void setV(String v) { + this.v = v; + } + + public String getK() { + return k; + } + + public void setK(String k) { + this.k = k; + } + + } + + // Returns a random 4-digit numeric string + public static String getRandomValue() { + String str = "0123456789"; + StringBuilder sb = new StringBuilder(4); + Random random = new Random(); + for (int i = 0; i < 4; i++) { + char ch = str.charAt(random.nextInt(str.length())); + sb.append(ch); + } + return sb.toString(); + } + + // Current Unix timestamp, in seconds + public static String getTimestamp() { + Date date = new Date(); + String timestamp = String.valueOf(date.getTime() / 1000); + return timestamp; + } + + // Current time formatted as yyyyMMddHHmmss + public static String getNowDate() { + Date date = new Date(); + SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMddHHmmss"); + return sdf.format(date); + } + + /** + * Used when calling the salt-api. + * + * @author: XIEJIAN948@pingan.com.cn + */ + public static String postJsonBody2(String url, int timeout, Map headerMap, List paramsList, String encoding) { - logger.info("successfully start post Json Body url{} ", url); - HttpPost post = new HttpPost(url); - try { - if (headerMap != null) { - for (Entry entry : headerMap.entrySet()) { - post.setHeader(entry.getKey(), entry.getValue().toString()); - } - } - RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) - .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); - post.setConfig(requestConfig); - if (paramsList.size() > 0) { - UrlEncodedFormEntity entity = new UrlEncodedFormEntity(paramsList, encoding); - post.setEntity(entity); - } - CloseableHttpResponse response = httpclient.execute(post); - try { - HttpEntity entity = response.getEntity(); - try { - if (entity != null) { - String str = EntityUtils.toString(entity, encoding); - return str; - } - } finally { - if (entity != null) { - entity.getContent().close(); - } - } - } finally { - if (response != 
null) { - response.close(); - } - } - } catch (UnsupportedEncodingException e) { - logger.error("UnsupportedEncodingException", e); - throw new RuntimeException("failed post json return blank!"); - } catch (Exception e) { - logger.error("Exception", e); - throw new RuntimeException("failed post json return blank!"); - } finally { - post.releaseConnection(); - } - logger.info("successfully end post Json Body url{} ", url); - return ""; - } - - /** - * 调用saltapi时 - * - * @author: XIEJIAN948@pingan.com.cn - */ - public static String postJsonBody3(String url, int timeout, Map headerMap, - Map paramsList, String encoding) { - HttpPost post = new HttpPost(url); - try { - if (headerMap != null) { - for (Entry entry : headerMap.entrySet()) { - post.setHeader(entry.getKey(), entry.getValue().toString()); - } - } - RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) - .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); - post.setConfig(requestConfig); - if (paramsList.size() > 0) { - //JSONArray jsonArray = JSONArray.fromObject(paramsList); - //post.setEntity(new StringEntity(jsonArray.get(0).toString(), encoding)); - post.setEntity(new StringEntity(null, encoding)); - //logger.info("successfully start post Json Body url{},params ", url,jsonArray.get(0).toString()); - logger.info("successfully start post Json Body url{},params ", url, null); - } - CloseableHttpResponse response = httpclient.execute(post); - try { - HttpEntity entity = response.getEntity(); - try { - if (entity != null) { - String str = EntityUtils.toString(entity, encoding); - return str; - } - } finally { - if (entity != null) { - entity.getContent().close(); - } - } - } finally { - if (response != null) { - response.close(); - } - } - } catch (UnsupportedEncodingException e) { - logger.error("UnsupportedEncodingException", e); - throw new RuntimeException("failed post json return blank!"); - } catch (Exception e) { - 
logger.error("Exception", e); - throw new RuntimeException("failed post json return blank!"); - } finally { - post.releaseConnection(); - } - logger.info("successfully end post Json Body url{} ", url); - return ""; - } - - public static String executeGet(String url) { - String rtnStr = ""; - HttpGet httpGet = new HttpGet(url); - try { - HttpResponse httpResponse = httpclient.execute(httpGet); - //获得返回的结果 - rtnStr = EntityUtils.toString(httpResponse.getEntity()); - } catch (IOException e) { - e.printStackTrace(); - } finally { - httpGet.releaseConnection(); - } - return rtnStr; - } + logger.info("successfully start post Json Body url {} ", url); + HttpPost post = new HttpPost(url); + try { + if (headerMap != null) { + for (Entry entry : headerMap.entrySet()) { + post.setHeader(entry.getKey(), entry.getValue().toString()); + } + } + RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) + .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); + post.setConfig(requestConfig); + if (paramsList.size() > 0) { + UrlEncodedFormEntity entity = new UrlEncodedFormEntity(paramsList, encoding); + post.setEntity(entity); + } + CloseableHttpResponse response = httpclient.execute(post); + try { + HttpEntity entity = response.getEntity(); + try { + if (entity != null) { + String str = EntityUtils.toString(entity, encoding); + return str; + } + } finally { + if (entity != null) { + entity.getContent().close(); + } + } + } finally { + if (response != null) { + response.close(); + } + } + } catch (UnsupportedEncodingException e) { + logger.error("UnsupportedEncodingException", e); + throw new RuntimeException("failed post json return blank!"); + } catch (Exception e) { + logger.error("Exception", e); + throw new RuntimeException("failed post json return blank!"); + } finally { + post.releaseConnection(); + } + logger.info("successfully end post Json Body url {} ", url); + return ""; + } + + /** + * Used when calling the salt-api. + * 
+ * @author: XIEJIAN948@pingan.com.cn + */ + public static String postJsonBody3(String url, int timeout, Map headerMap, + Map paramsList, String encoding) { + HttpPost post = new HttpPost(url); + try { + if (headerMap != null) { + for (Entry entry : headerMap.entrySet()) { + post.setHeader(entry.getKey(), entry.getValue().toString()); + } + } + RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout) + .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build(); + post.setConfig(requestConfig); + if (paramsList.size() > 0) { + //JSONArray jsonArray = JSONArray.fromObject(paramsList); + //post.setEntity(new StringEntity(jsonArray.get(0).toString(), encoding)); + // The JSON serialization above is disabled; send an empty body instead of + // null, which would throw an IllegalArgumentException at runtime. + post.setEntity(new StringEntity("", encoding)); + logger.info("successfully start post Json Body url {} ", url); + } + CloseableHttpResponse response = httpclient.execute(post); + try { + HttpEntity entity = response.getEntity(); + try { + if (entity != null) { + String str = EntityUtils.toString(entity, encoding); + return str; + } + } finally { + if (entity != null) { + entity.getContent().close(); + } + } + } finally { + if (response != null) { + response.close(); + } + } + } catch (UnsupportedEncodingException e) { + logger.error("UnsupportedEncodingException", e); + throw new RuntimeException("failed post json return blank!"); + } catch (Exception e) { + logger.error("Exception", e); + throw new RuntimeException("failed post json return blank!"); + } finally { + post.releaseConnection(); + } + logger.info("successfully end post Json Body url {} ", url); + return ""; + } + + public static String executeGet(String url) { + String rtnStr = ""; + HttpGet httpGet = new HttpGet(url); + try { + HttpResponse httpResponse = httpclient.execute(httpGet); + // Read the returned result + rtnStr = EntityUtils.toString(httpResponse.getEntity()); + } catch 
(IOException e) { + // log the failure instead of printing the stack trace to stderr + logger.error("executeGet failed, url: " + url, e); + } finally { + httpGet.releaseConnection(); + } + return rtnStr; + } } diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpUtils.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpUtils.java index 9d073b502..c4451a211 100644 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpUtils.java +++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/HttpUtils.java @@ -1,5 +1,20 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ package com.webank.wedatasphere.dss.visualis.utils; - import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; import org.apache.http.client.CookieStore; import org.apache.http.client.methods.CloseableHttpResponse; @@ -10,73 +25,75 @@ import org.apache.http.impl.client.HttpClientBuilder; import org.apache.http.impl.cookie.BasicClientCookie; import org.apache.http.util.EntityUtils; -import org.apache.linkis.errorcode.client.ClientConfiguration; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import javax.servlet.http.Cookie; import javax.servlet.http.HttpServletRequest; import java.io.IOException; -import java.net.URI; import java.net.URISyntaxException; import java.util.Date; +/** + * created by cooperyang on 2019/1/23 + * Description: + */ public class HttpUtils { - private static final String GATEWAY_URL = ClientConfiguration.getGatewayUrl(); + private static final String GATEWAY_URL = CommonConfig.GATEWAY_PROTOCOL().getValue() + + CommonConfig.GATEWAY_IP().getValue() + ":" + CommonConfig.GATEWAY_PORT().getValue(); private static final String DATABASE_URL = GATEWAY_URL + CommonConfig.DB_URL_SUFFIX().getValue(); private static final String TABLE_URL = GATEWAY_URL + CommonConfig.TABLE_URL_SUFFIX().getValue(); - private static final String COLUMN_URL = GATEWAY_URL + CommonConfig.COLUMN_URL_SUFFIX().getValue(); + private static final String COLUMN_URL = GATEWAY_URL + CommonConfig.COLUMN_URL_SUFFIX().getValue(); private static final Logger logger = LoggerFactory.getLogger(HttpUtils.class); - - public static String getDbs(String ticketId) { + public static String getDbs(String ticketId){ logger.info("开始进行获取hive数据库的信息"); CookieStore cookieStore = new BasicCookieStore(); CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); HttpGet httpGet = new HttpGet(DATABASE_URL); BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); 
cookie.setVersion(0); - cookie.setDomain(getIpFromUrl(GATEWAY_URL)); + cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); cookie.setPath("/"); - cookie.setExpiryDate(new Date(System.currentTimeMillis() + 1000 * 60 * 60 * 24 * 30L)); + cookie.setExpiryDate(new Date(System.currentTimeMillis() + 1000*60*60*24*30L)); cookieStore.addCookie(cookie); String hiveDBJson = null; - try { + try{ CloseableHttpResponse response = httpClient.execute(httpGet); hiveDBJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (IOException e) { - logger.error("通过HTTP方式获取Hive数据库信息失败, reason:", e); + }catch(IOException e){ + logger.error("通过HTTP方式获取Hive数据库信息失败, reason:" , e); } return hiveDBJson; } - public static String getTables(String ticketId, String hiveDBName) { + public static String getTables(String ticketId, String hiveDBName){ logger.info("开始获取hive数据库{} 相关的表以及字段信息", hiveDBName); String tableJson = null; try { - URIBuilder uriBuilder = new URIBuilder(TABLE_URL); + URIBuilder uriBuilder = new URIBuilder(TABLE_URL); uriBuilder.addParameter("database", hiveDBName); CookieStore cookieStore = new BasicCookieStore(); CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); HttpGet httpGet = new HttpGet(uriBuilder.build()); BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); cookie.setVersion(0); - cookie.setDomain(getIpFromUrl(GATEWAY_URL)); + cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); cookie.setPath("/"); cookieStore.addCookie(cookie); CloseableHttpResponse response = httpClient.execute(httpGet); tableJson = EntityUtils.toString(response.getEntity(), "UTF-8"); } catch (URISyntaxException e) { logger.error("{} url 有问题", TABLE_URL, e); - } catch (IOException e) { + } catch(IOException e){ logger.error("获取hive数据库 {} 下面的表失败了", hiveDBName, e); } return tableJson; } - public static String getColumns(String dbName, String tableName, String ticketId) { + public 
static String getColumns(String dbName, String tableName, String ticketId){ logger.info("开始获取hive数据库表 {}.{} 的字段信息", dbName, tableName); String columnJson = null; - try { + try{ URIBuilder uriBuilder = new URIBuilder(COLUMN_URL); uriBuilder.addParameter("database", dbName); uriBuilder.addParameter("table", tableName); @@ -85,43 +102,31 @@ public static String getColumns(String dbName, String tableName, String ticketId HttpGet httpGet = new HttpGet(uriBuilder.build()); BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); cookie.setVersion(0); - cookie.setDomain(getIpFromUrl(GATEWAY_URL)); + cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); cookie.setPath("/"); cookieStore.addCookie(cookie); CloseableHttpResponse response = httpClient.execute(httpGet); columnJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (final URISyntaxException e) { + }catch(final URISyntaxException e){ logger.error("{} url 有问题", COLUMN_URL, e); - } catch (final IOException e) { + }catch(final IOException e){ logger.error("获取hive数据库 {}.{} 字段信息失败 ", dbName, tableName, e); } return columnJson; } - public static String getUserTicketId(HttpServletRequest request) { + public static String getUserTicketId(HttpServletRequest request){ Cookie[] cookies = request.getCookies(); String ticketId = null; - for (Cookie cookie : cookies) { - if (CommonConfig.TICKET_ID_STRING().getValue().equalsIgnoreCase(cookie.getName())) { - ticketId = cookie.getValue(); + for (Cookie cookie : cookies){ + if(CommonConfig.TICKET_ID_STRING().getValue().equalsIgnoreCase(cookie.getName())){ + ticketId = cookie.getValue(); break; } } return ticketId; } - // http://127.0.0.1:9001 - private static String getIpFromUrl(String url) { - URI uri = URI.create(url); - String ip = ""; - int port = 0; - if (uri != null) { - ip = uri.getHost(); - port = uri.getPort(); - } - return ip; - } - } diff --git 
a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/StringConstant.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/StringConstant.java deleted file mode 100644 index 9a36b3f87..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/StringConstant.java +++ /dev/null @@ -1,131 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.utils; - -public class StringConstant { - - public final static String ID = "id"; - - public final static String PROJECT = "project"; - - public final static String PROJECT_ID = "projectId"; - - public final static String PARTIAL = "partial"; - - public final static String RESOURCE_ID = "resourceId"; - - public final static String VERSION = "version"; - - public final static String BML_FILE_PREFIX = "visualis_exported_"; - - public final static String COMMA = ","; - - public final static String PROJECT_VERSION = "projectVersion"; - - public final static String PROJECT_VERSION_DEFAULT_VALUE = "v1"; - - public final static String WORKFLOW_VERSION = "flowVersion"; - - public final static String WORKFLOW_VERSION_DEFAULT_VALUE = "v00001"; - - public final static String CONTEXT_ID = "contextID"; - - public final static String WIDGET = "widget"; - - public final static String WIDGETS = "widgets"; - - public final static String WIDGET_ID = "widgetId"; - - public final static String DASHBOARD = "dashboard"; - - public final static String DASHBOARDS = "dashboards"; - - public final static String DASHBOARD_ID = "dashboardId"; - - public final static String DASHBOARD_PORTAL = "dashboardPortal"; - - public final static String PORTAL = "portal"; - - public final static String DISPLAY = "display"; - - public final static String VIEW = "view"; - - public final static String VIEW_ID = "viewId"; - - public final static String SOURCE = "source"; - - public final static String SOURCE_CONTENT = "dataSourceContent"; - - public final static String DASHBOARD_PORTAL_IDS = "dashboardPortalIds"; - - public 
final static String DISPLAY_IDS = "displayIds"; - - public final static String WIDGET_IDS = "widgetIds"; - - public final static String VIEW_IDS = "viewIds"; - - public final static String NODE_NAME = "nodeName"; - - public final static String URL = "url"; - - public final static String NAME = "name"; - - public final static String UPDATE = "updated"; - - public final static String ALIAS = "alias"; - - public final static String TYPE = "type"; - - public final static String QUERY = "query"; - - public final static String VALUE_TYPE = "valueType"; - - public final static String STRING_LOWERCASE = "string"; - - public final static String STRING_UPPERCASE = "STRING"; - - public final static String SQL_TYPE = "sqlType"; - - public final static String ENGINE_TYPE = "engineTypes"; - - public final static String UDF = "udf"; - - public final static String COLUMNS = "columns"; - - public final static String ROWS = "rows"; - - public final static String COLS = "cols"; - - public final static String METRICS = "metrics"; - - public final static String RESULT_LIST = "resultList"; - - public final static String VARIABLE_TOKEN_SPLIT = "--@team"; - - public final static String DEFAULT_VALUES = "defaultValues"; - - public final static String KEY = "key"; - - public final static String REF_NODE_NAME = "refNodeName"; - - public static final String DELIM_START = "{"; - - public static final String DELIM_END = "}"; - - public static final String EQUAL_SIGN = "="; - - public static final String SINGLE_QUOTES = "'"; - - public static final String EMPTY = ""; - - public static final String UNDERLINE = "_"; - - // view - public static final String PAGINATE_EXEC_STATUS = "paginateWithExecStatus"; - - public static final String IS_LINKIS_DATASOURCE = "isLinkisDataSource"; - - // param - public static final String PARAM = "params"; - - public static final String DATASOURCE_CONTENT = "dataSourceContent"; - -} diff --git 
a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/VisualisProjectCommonUtil.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/VisualisProjectCommonUtil.java deleted file mode 100644 index f24df6ddc..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/VisualisProjectCommonUtil.java +++ /dev/null @@ -1,32 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.utils; - -import org.apache.commons.lang.StringUtils; - -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -public class VisualisProjectCommonUtil { - - public static String updateName(String name, String versionSuffix) { - if (StringUtils.isBlank(versionSuffix)) { - return name; - } - - //节点截取前面一段字符,如widget_5979_v1_v000007截取widget_5979 - Pattern pattern1 = Pattern.compile("([a-zA-Z]+_\\d+).*"); - Matcher matcher = pattern1.matcher(name); - if (matcher.find()) { - return matcher.group(1) + "_" + versionSuffix; - } else { - //数据源名字匹配 - Pattern pattern2 = Pattern.compile("(\\S+)_v\\S+"); - Matcher matcher2 = pattern2.matcher(name); - if (matcher2.find()) { - return matcher2.group(1) + "_" + versionSuffix; - } - } - return name + "_" + versionSuffix; - } - - -} diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigration.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigration.java deleted file mode 100644 index ae35e706a..000000000 --- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigration.java +++ /dev/null @@ -1,501 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.utils.export; - -import com.google.gson.JsonArray; -import com.google.gson.JsonElement; -import com.google.gson.JsonObject; -import org.apache.linkis.adapt.LinkisUtils; -import org.apache.commons.lang.StringUtils; - -import java.util.Map; -import java.util.Random; - -import static edp.davinci.common.utils.ScriptUtils.getExecuptParamScriptEngine; -import static 
edp.davinci.common.utils.ScriptUtils.getViewExecuteParam; - -public class WidgetMigration { - - public static String migrate(String oldConfig, Long viewId) throws Exception { - JsonObject jsonObject = LinkisUtils.gson().fromJson(oldConfig, JsonObject.class); - - JsonObject columnsWidth = jsonObject.getAsJsonObject("columnsWidth"); - JsonArray orders = jsonObject.getAsJsonArray("orders"); - JsonObject columnsShowAsPercent = jsonObject.getAsJsonObject("columnsShowAsPercent"); - - // model - JsonObject model = jsonObject.getAsJsonObject("model"); - for (Map.Entry modelItem : model.entrySet()) { - JsonObject newItem = modelItem.getValue().getAsJsonObject().deepCopy(); - if (modelItem.getValue().getAsJsonObject().get("sqlType") == null) { - newItem.addProperty("sqlType", "STRING"); - } else { - newItem.addProperty("sqlType", modelItem.getValue().getAsJsonObject().get("sqlType").getAsString().toUpperCase()); - } - model.add(modelItem.getKey(), newItem); - } - jsonObject.add("model", model); - - //cols - JsonArray oldCols = jsonObject.getAsJsonArray("cols"); - JsonArray newCols = new JsonArray(); - for (JsonElement col : oldCols) { - String colName = col.getAsString(); - JsonObject newCol = new JsonObject(); - newCol.addProperty("name", colName); - newCol.addProperty("type", model.getAsJsonObject(colName).getAsJsonPrimitive("modelType").getAsString()); - newCol.addProperty("visualType", model.getAsJsonObject(colName).getAsJsonPrimitive("visualType").getAsString()); - - JsonObject field = new JsonObject(); - field.addProperty("alias", ""); - field.addProperty("desc", ""); - field.addProperty("useExpression", false); - newCol.add("field", field); - - JsonObject format = new JsonObject(); - format.addProperty("formatType", "default"); - newCol.add("format", format); - - if (orders != null && orders.size() > 0) { - for (JsonElement order : orders) { - if (colName.equals(order.getAsJsonObject().getAsJsonPrimitive("column").getAsString())) { - JsonObject sort = new 
JsonObject(); - sort.add("sortType", order.getAsJsonObject().getAsJsonPrimitive("direction")); - newCol.add("sort", sort); - break; - } - } - } - - if (columnsWidth != null && columnsWidth.has(colName)) { - newCol.add("width", columnsWidth.get(colName)); - newCol.addProperty("widthChanged", true); - newCol.addProperty("alreadySetWidth", true); - //"oldColumnCounts": 5 - } else { - newCol.addProperty("width", 270); - newCol.addProperty("widthChanged", false); - newCol.addProperty("alreadySetWidth", true); - } - newCol.addProperty("from", "cols"); - newCols.add(newCol); - } - jsonObject.add("cols", newCols); - - //rows - JsonArray oldRows = jsonObject.getAsJsonArray("rows"); - JsonArray newRows = new JsonArray(); - for (JsonElement row : oldRows) { - String rowName = row.getAsString(); - JsonObject newRow = new JsonObject(); - newRow.addProperty("name", rowName); - newRow.addProperty("type", model.getAsJsonObject(rowName).getAsJsonPrimitive("modelType").getAsString()); - newRow.addProperty("visualType", model.getAsJsonObject(rowName).getAsJsonPrimitive("visualType").getAsString()); - - JsonObject field = new JsonObject(); - field.addProperty("alias", ""); - field.addProperty("desc", ""); - field.addProperty("useExpression", false); - newRow.add("field", field); - - JsonObject format = new JsonObject(); - format.addProperty("formatType", "default"); - newRow.add("format", format); - - if (orders != null && orders.size() > 0) { - for (JsonElement order : orders) { - if (rowName.equals(order.getAsJsonObject().getAsJsonPrimitive("column").getAsString())) { - JsonObject sort = new JsonObject(); - sort.add("sortType", order.getAsJsonObject().getAsJsonPrimitive("direction")); - newRow.add("sort", sort); - break; - } - } - } - - if (columnsWidth != null && columnsWidth.has(rowName)) { - newRow.add("width", columnsWidth.get(rowName)); - newRow.addProperty("widthChanged", true); - newRow.addProperty("alreadySetWidth", true); - //"oldColumnCounts": 5 - } else { - 
newRow.addProperty("width", 270); - newRow.addProperty("widthChanged", false); - newRow.addProperty("alreadySetWidth", true); - } - newRow.addProperty("from", "rows"); - newRows.add(newRow); - } - jsonObject.add("rows", newRows); - - - //metrics - JsonArray oldMetrics = jsonObject.getAsJsonArray("metrics"); - int oldColumnCounts = oldCols.size() + oldMetrics.size(); - JsonArray newMetrics = new JsonArray(); - for (JsonElement metric : oldMetrics) { - JsonObject newMetric = metric.deepCopy().getAsJsonObject(); - newMetric.addProperty("name", StringUtils.replace(metric.getAsJsonObject().get("name").getAsString(), "davinci", "Visualis")); - JsonObject oldChart = metric.getAsJsonObject().getAsJsonObject("chart"); - JsonObject newChart = oldChart.deepCopy(); - JsonArray rules = new JsonArray(); - JsonObject rule = new JsonObject(); - rule.add("dimension", oldChart.get("requireDimetions")); - rule.add("metric", oldChart.get("requireMetrics")); - rules.add(rule); - newChart.remove("requireDimetions"); - newChart.remove("requireMetrics"); - newChart.add("rules", rules); - - //for table - if (jsonObject.getAsJsonObject("chartStyles").has("table")) { - newChart.addProperty("name", "table"); - newChart.addProperty("title", "表格"); - newChart.addProperty("icon", "icon-table"); - newChart.addProperty("icon", "icon-table"); - newChart.addProperty("coordinate", "other"); - } - - JsonObject data = new JsonObject(); - JsonObject tmp = new JsonObject(); - tmp.addProperty("title", "列"); - tmp.addProperty("type", "category"); - data.add("cols", tmp); - JsonObject tmp1 = new JsonObject(); - tmp1.addProperty("title", "行"); - tmp1.addProperty("type", "category"); - data.add("rows", tmp1); - JsonObject tmp2 = new JsonObject(); - tmp2.addProperty("title", "指标"); - tmp2.addProperty("type", "value"); - data.add("metrics", tmp2); - JsonObject tmp3 = new JsonObject(); - tmp3.addProperty("title", "筛选"); - tmp3.addProperty("type", "all"); - data.add("filters", tmp3); - JsonObject tmp4 = new 
JsonObject(); - tmp4.addProperty("title", "颜色"); - tmp4.addProperty("type", "category"); - data.add("color", tmp4); - newChart.add("data", data); - JsonObject style = newChart.getAsJsonObject("style").deepCopy(); - //for table - if (jsonObject.getAsJsonObject("chartStyles").has("table") && style.has("pivot")) { - JsonObject table = new JsonObject(); - for (Map.Entry entry : style.getAsJsonObject("pivot").entrySet()) { - table.add(entry.getKey(), entry.getValue()); - } - table.add("headerConfig", new JsonArray()); - table.add("columnsConfig", new JsonArray()); - table.add("leftFixedColumns", new JsonArray()); - table.add("rightFixedColumns", new JsonArray()); - table.addProperty("headerFixed", true); - table.addProperty("autoMergeCell", false); - table.addProperty("bordered", true); - table.addProperty("size", "default"); - table.addProperty("withPaging", true); - table.addProperty("pageSize", "20"); - table.addProperty("withNoAggregators", false); - style.add("table", table); - style.remove("pivot"); - newMetric.addProperty("oldColumnCounts", oldColumnCounts); - } else { - if (style.getAsJsonObject("table") != null) { - JsonObject table = style.getAsJsonObject("table").deepCopy(); - table.add("headerConfig", new JsonArray()); - table.add("columnsConfig", new JsonArray()); - table.add("leftFixedColumns", new JsonArray()); - table.add("rightFixedColumns", new JsonArray()); - table.addProperty("headerFixed", true); - table.addProperty("autoMergeCell", false); - table.addProperty("bordered", true); - table.addProperty("size", "default"); - table.addProperty("withPaging", true); - table.addProperty("pageSize", "20"); - table.addProperty("withNoAggregators", false); - style.add("table", table); - } - } - - style.add("spec", new JsonObject()); - //style.add("spec", new JsonObject()); - newChart.add("style", style); - newMetric.add("chart", newChart); - - - JsonObject field = new JsonObject(); - field.addProperty("alias", ""); - field.addProperty("desc", ""); - 
-                field.addProperty("useExpression", false);
-                newMetric.add("field", field);
-
-                JsonObject format = new JsonObject();
-                if (columnsShowAsPercent != null &&
-                        columnsShowAsPercent.has(metric.getAsJsonObject().get("name").getAsString()) &&
-                        columnsShowAsPercent.get(metric.getAsJsonObject().get("name").getAsString()).getAsBoolean()) {
-                    format.addProperty("formatType", "percentage");
-                    JsonObject percentage = new JsonObject();
-                    percentage.addProperty("decimalPlaces", 2);
-                    newMetric.add("percentage", percentage);
-                    newMetric.add("format", format);
-                } else {
-                    format.addProperty("formatType", "default");
-                    newMetric.add("format", format);
-                }
-
-
-                if (metric.getAsJsonObject().has("displayName")) {
-                    String displayName = metric.getAsJsonObject().get("displayName").getAsString();
-                    if (columnsWidth.has(displayName.toLowerCase())) {
-                        newMetric.add("width", columnsWidth.get(displayName.toLowerCase()));
-                        newMetric.addProperty("widthChanged", true);
-                        newMetric.addProperty("alreadySetWidth", true);
-                    } else if (columnsWidth.has(displayName.toUpperCase())) {
-                        newMetric.add("width", columnsWidth.get(displayName.toUpperCase()));
-                        newMetric.addProperty("widthChanged", true);
-                        newMetric.addProperty("alreadySetWidth", true);
-                    } else {
-                        if (metric.getAsJsonObject().has("width")) {
-                            newMetric.add("width", metric.getAsJsonObject().get("width"));
-                            newMetric.addProperty("widthChanged", true);
-                            newMetric.addProperty("alreadySetWidth", true);
-                        } else {
-                            newMetric.addProperty("width", 270);
-                            newMetric.addProperty("widthChanged", false);
-                            newMetric.addProperty("alreadySetWidth", true);
-                        }
-                    }
-                } else {
-                    if (metric.getAsJsonObject().has("width")) {
-                        newMetric.add("width", metric.getAsJsonObject().get("width"));
-                        newMetric.addProperty("widthChanged", true);
-                        newMetric.addProperty("alreadySetWidth", true);
-                    } else {
-                        newMetric.addProperty("width", 270);
-                        newMetric.addProperty("widthChanged", false);
-                        newMetric.addProperty("alreadySetWidth", true);
-                    }
-                }
-
-                //"oldColumnCounts": 5,
-                if (metric.getAsJsonObject().has("sort")) {
-                    JsonObject sort = new JsonObject();
-                    sort.add("sortType", metric.getAsJsonObject().get("sort"));
-                    newMetric.add("sort", sort);
-                }
-                newMetric.addProperty("from", "metrics");
-                newMetrics.add(newMetric);
-            }
-            jsonObject.add("metrics", newMetrics);
-
-            //filters
-            JsonArray oldFilters = jsonObject.getAsJsonArray("filters");
-            JsonArray newFilters = new JsonArray();
-            for (JsonElement filter : oldFilters) {
-                JsonObject newFilter = filter.getAsJsonObject().deepCopy();
-                JsonObject config = newFilter.getAsJsonObject("config");
-                if (config.get("filterSource").isJsonArray()) {
-                    //value
-                    JsonArray sqlModel = new JsonArray();
-                    JsonObject sqlModelValue = new JsonObject();
-                    sqlModelValue.addProperty("name", newFilter.get("name").getAsString());
-                    sqlModelValue.addProperty("type", "filter");
-                    sqlModelValue.addProperty("operator", "in");
-                    if (newFilter.get("visualType").isJsonPrimitive()) {
-                        sqlModelValue.addProperty("sqlType", newFilter.get("visualType").getAsString().toUpperCase());
-                    } else {
-                        sqlModelValue.addProperty("sqlType", newFilter.get("visualType").getAsJsonObject().get("visualType").getAsString().toUpperCase());
-                    }
-
-                    JsonArray value = new JsonArray();
-                    for (JsonElement filterValue : config.get("filterSource").getAsJsonArray()) {
-                        //TODO for number type
-                        value.add("'" + filterValue.getAsString() + "'");
-                    }
-                    sqlModelValue.add("value", value);
-                    sqlModel.add(sqlModelValue);
-                    config.add("sqlModel", sqlModel);
-
-                } else {
-                    //relation
-                    JsonArray sqlModel = convertSourceToModel(config.get("filterSource").getAsJsonObject(), newFilter);
-                    config.add("sqlModel", sqlModel);
-                }
-                config.remove("sql");
-                newFilter.remove("visualType");
-                newFilter.add("config", config);
-                newFilters.add(newFilter);
-            }
-
-
-            jsonObject.add("filters", newFilters);
-
-            // chartStyles
-            JsonObject chartStyles = jsonObject.getAsJsonObject("chartStyles");
-            if (chartStyles.getAsJsonObject("table") != null) {
-                JsonObject table = chartStyles.getAsJsonObject("table").deepCopy();
-                JsonArray headerConfig = new JsonArray();
-                JsonArray columnConfig = new JsonArray();
-                for (JsonElement newCol : newCols) {
-                    JsonObject config = new JsonObject();
-                    config.addProperty("key", Integer.toString(new Random().nextInt(100000)));
-                    config.addProperty("headerName", newCol.getAsJsonObject().get("name").getAsString());
-                    config.addProperty("alias", newCol.getAsJsonObject().get("name").getAsString());
-                    config.addProperty("visualType", newCol.getAsJsonObject().get("visualType").getAsString());
-                    config.addProperty("isGroup", false);
-                    config.add("children", null);
-                    JsonObject style = new JsonObject();
-                    style.add("fontSize", table.get("fontSize"));
-                    style.add("fontFamily", table.get("fontFamily"));
-                    if (table.get("isHeaderBold") != null && table.get("isHeaderBold").getAsBoolean()) {
-                        style.addProperty("fontWeight", "bold");
-                    } else {
-                        style.addProperty("fontWeight", "normal");
-                    }
-                    style.add("fontColor", table.get("color"));
-                    style.add("backgroundColor", table.get("headerBackgroundColor"));
-                    style.addProperty("justifyContent", "flex-start");
-                    config.add("style", style);
-                    headerConfig.add(config);
-
-                    JsonObject cConfig = new JsonObject();
-                    cConfig.addProperty("columnName", newCol.getAsJsonObject().get("name").getAsString());
-                    cConfig.addProperty("alias", newCol.getAsJsonObject().get("name").getAsString());
-                    cConfig.addProperty("visualType", newCol.getAsJsonObject().get("visualType").getAsString());
-                    cConfig.addProperty("styleType", 0);
-
-                    JsonObject cStyle = new JsonObject();
-                    cStyle.add("fontSize", table.get("fontSize"));
-                    cStyle.add("fontFamily", table.get("fontFamily"));
-                    if (table.get("isBodyBold") != null && table.get("isBodyBold").getAsBoolean()) {
-                        cStyle.addProperty("fontWeight", "bold");
-                    } else {
-                        cStyle.addProperty("fontWeight", "normal");
-                    }
-                    cStyle.add("fontColor", table.get("color"));
-                    cStyle.addProperty("backgroundColor", "transparent");
-                    cStyle.addProperty("justifyContent", "flex-start");
-                    cConfig.add("style", cStyle);
-                    cConfig.add("conditionStyles", new JsonArray());
-                    cConfig.add("width", columnsWidth.get(newCol.getAsJsonObject().get("name").getAsString()));
-                    cConfig.addProperty("alreadySetWidth", true);
-                    cConfig.addProperty("oldColumnCounts", oldColumnCounts);
-                    cConfig.addProperty("widthChanged", false);
-                    columnConfig.add(cConfig);
-                }
-                for (JsonElement newMetric : newMetrics) {
-                    JsonObject config = new JsonObject();
-                    config.addProperty("key", Integer.toString(new Random().nextInt(100000)));
-                    config.addProperty("headerName", newMetric.getAsJsonObject().get("name").getAsString());
-                    config.addProperty("alias", newMetric.getAsJsonObject().get("name").getAsString());
-                    config.addProperty("visualType", newMetric.getAsJsonObject().get("visualType").getAsString());
-                    config.addProperty("isGroup", false);
-                    config.add("children", null);
-                    JsonObject style = new JsonObject();
-                    style.add("fontSize", table.get("fontSize"));
-                    style.add("fontFamily", table.get("fontFamily"));
-                    if (table.get("isHeaderBold") != null && table.get("isHeaderBold").getAsBoolean()) {
-                        style.addProperty("fontWeight", "bold");
-                    } else {
-                        style.addProperty("fontWeight", "normal");
-                    }
-                    style.add("fontColor", table.get("color"));
-                    style.add("backgroundColor", table.get("headerBackgroundColor"));
-                    style.addProperty("justifyContent", "flex-start");
-                    config.add("style", style);
-                    headerConfig.add(config);
-
-                    JsonObject cConfig = new JsonObject();
-                    cConfig.addProperty("columnName", newMetric.getAsJsonObject().get("name").getAsString());
-                    cConfig.addProperty("alias", newMetric.getAsJsonObject().get("name").getAsString());
-                    cConfig.addProperty("visualType", newMetric.getAsJsonObject().get("visualType").getAsString());
-                    cConfig.addProperty("styleType", 0);
-
-                    JsonObject cStyle = new JsonObject();
-                    cStyle.add("fontSize", table.get("fontSize"));
-                    cStyle.add("fontFamily", table.get("fontFamily"));
-                    if (table.get("isBodyBold") != null && table.get("isBodyBold").getAsBoolean()) {
-                        cStyle.addProperty("fontWeight", "bold");
-                    } else {
-                        cStyle.addProperty("fontWeight", "normal");
-                    }
-                    cStyle.add("fontColor", table.get("color"));
-                    cStyle.addProperty("backgroundColor", "transparent");
-                    cStyle.addProperty("justifyContent", "flex-start");
-                    cConfig.add("style", cStyle);
-                    cConfig.add("conditionStyles", new JsonArray());
-                    cConfig.add("width", columnsWidth.get(newMetric.getAsJsonObject().get("name").getAsString()));
-                    cConfig.addProperty("alreadySetWidth", true);
-                    cConfig.addProperty("oldColumnCounts", oldColumnCounts);
-                    cConfig.addProperty("widthChanged", false);
-                    columnConfig.add(cConfig);
-                }
-                table.add("headerConfig", headerConfig);
-                table.add("columnsConfig", columnConfig);
-                table.add("leftFixedColumns", new JsonArray());
-                table.add("rightFixedColumns", new JsonArray());
-                table.addProperty("headerFixed", true);
-                table.addProperty("autoMergeCell", false);
-                table.addProperty("bordered", true);
-                table.addProperty("size", "default");
-                table.addProperty("withPaging", true);
-                table.addProperty("pageSize", "20");
-                table.addProperty("withNoAggregators", false);
-                chartStyles.add("table", table);
-                chartStyles.add("spec", new JsonObject());
-                jsonObject.add("chartStyles", chartStyles);
-            }
-
-
-            // pagination
-            JsonObject pagination = new JsonObject();
-            pagination.addProperty("pageNo", 1);
-            pagination.addProperty("pageSize", 20);
-            pagination.addProperty("withPaging", true);
-            pagination.addProperty("totalCount", 0);
-            jsonObject.add("pagination", pagination);
-
-            jsonObject.add("controls", new JsonArray());
-            jsonObject.add("computed", new JsonArray());
-            jsonObject.addProperty("cache", false);
-            jsonObject.addProperty("nativeQuery", false);
-            jsonObject.addProperty("expired", 300);
-            jsonObject.addProperty("autoLoadData", true);
-
-            jsonObject.addProperty("view", viewId);
-            jsonObject.addProperty("contextId", "");
-            jsonObject.addProperty("nodeName", "");
-
-            jsonObject.addProperty("renderType", "clear");
-
-
-            jsonObject.add("query", LinkisUtils.gson().toJsonTree(getViewExecuteParam(getExecuptParamScriptEngine(), null, LinkisUtils.gson().toJson(jsonObject), null)));
-
-            return LinkisUtils.gson().toJson(jsonObject);
-        }
-
-        public static JsonArray convertSourceToModel(JsonObject filterSource, JsonObject filter) {
-            JsonArray sqlModel = new JsonArray();
-            sqlModel.add(sourceToModelRec(filterSource, filter));
-            return sqlModel;
-        }
-
-        public static JsonObject sourceToModelRec(JsonObject parentSource, JsonObject filter) {
-            JsonObject parentModel = new JsonObject();
-            if (parentSource.get("type").equals("link")) {
-                parentModel.addProperty("type", "relation");
-                parentModel.addProperty("value", parentSource.get("rel").getAsString());
-                JsonArray children = new JsonArray();
-                for (JsonElement childSource : parentSource.getAsJsonArray("children")) {
-                    children.add(sourceToModelRec(childSource.getAsJsonObject(), filter));
-                }
-                return parentModel;
-            } else {
-                parentModel.addProperty("type", "filter");
-                parentModel.addProperty("name", filter.get("name").getAsString());
-                parentModel.add("value", parentSource.get("filterValue"));
-                parentModel.add("operator", parentSource.get("filterOperator"));
-                parentModel.addProperty("sqlType", filter.get("visualType").getAsString().toUpperCase());
-                return parentModel;
-            }
-        }
-
-}
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveColumnModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveColumnModel.java
similarity index 62%
rename from server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveColumnModel.java
rename to server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveColumnModel.java
index 6f37512ca..33897366c 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveColumnModel.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveColumnModel.java
@@ -1,9 +1,29 @@
-package com.webank.wedatasphere.dss.visualis.model.hivemodel;
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+package com.webank.wedatasphere.dss.visualis.utils.model;
 
 import java.util.List;
 
-public class HiveColumnModel extends HiveModel {
-    public static class Column {
+/**
+ * created by cooperyang on 2019/1/26
+ * Description:
+ */
+public class HiveColumnModel extends HiveModel{
+    public static class Column{
         private String columnName;
         private String columnType;
         private String columnComment;
@@ -43,7 +63,7 @@ public void setPartitioned(boolean partitioned) {
     }
 
-    public static class Data {
+    public static class Data{
         private List columns;
 
         public List getColumns() {
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveDBModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveDBModel.java
new file mode 100644
index 000000000..08c8b9013
--- /dev/null
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveDBModel.java
@@ -0,0 +1,60 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+package com.webank.wedatasphere.dss.visualis.utils.model;
+
+import java.util.List;
+
+/**
+ * created by cooperyang on 2019/1/24
+ * Description:
+ */
+public class HiveDBModel extends HiveModel{
+    public static class HiveDB{
+        private String dbName;
+
+        public String getDbName() {
+            return dbName;
+        }
+
+        public void setDbName(String dbName) {
+            this.dbName = dbName;
+        }
+    }
+
+    public static class Data{
+        private List dbs;
+
+        public List getDbs() {
+            return dbs;
+        }
+
+        public void setDbs(List dbs) {
+            this.dbs = dbs;
+        }
+    }
+    private Data data;
+
+
+    public Data getData() {
+        return data;
+    }
+
+    public void setData(Data data) {
+        this.data = data;
+    }
+}
+
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveModel.java
new file mode 100644
index 000000000..ebaccc403
--- /dev/null
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveModel.java
@@ -0,0 +1,52 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+package com.webank.wedatasphere.dss.visualis.utils.model;
+
+/**
+ * created by cooperyang on 2019/1/25
+ * Description:
+ */
+public abstract class HiveModel {
+    private String method;
+    private int status;
+    private String message;
+
+    public String getMethod() {
+        return method;
+    }
+
+    public void setMethod(String method) {
+        this.method = method;
+    }
+
+    public int getStatus() {
+        return status;
+    }
+
+    public void setStatus(int status) {
+        this.status = status;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public void setMessage(String message) {
+        this.message = message;
+    }
+}
+
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveSchemaModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveSchemaModel.java
similarity index 60%
rename from server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveSchemaModel.java
rename to server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveSchemaModel.java
index a10d182b7..5763733e1 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveSchemaModel.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveSchemaModel.java
@@ -1,7 +1,27 @@
-package com.webank.wedatasphere.dss.visualis.model.hivemodel;
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+package com.webank.wedatasphere.dss.visualis.utils.model;
 
 import java.util.List;
 
+/**
+ * created by cooperyang on 2019/1/26
+ * Description:
+ */
 public class HiveSchemaModel {
     public static class FrontColumn {
         private String name;
diff --git a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveTableModel.java b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveTableModel.java
similarity index 67%
rename from server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveTableModel.java
rename to server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveTableModel.java
index 8884fadb8..d6e0ce198 100644
--- a/server/src/main/java/com/webank/wedatasphere/dss/visualis/model/hivemodel/HiveTableModel.java
+++ b/server/src/main/java/com/webank/wedatasphere/dss/visualis/utils/model/HiveTableModel.java
@@ -1,11 +1,31 @@
-package com.webank.wedatasphere.dss.visualis.model.hivemodel;
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+package com.webank.wedatasphere.dss.visualis.utils.model;
 
 import java.util.List;
 
-public class HiveTableModel extends HiveModel {
+/**
+ * created by cooperyang on 2019/1/25
+ * Description:
+ */
+public class HiveTableModel extends HiveModel{
 
-    public static class HiveTable {
+    public static class HiveTable{
         private String tableName;
         private boolean isView;
         private String databaseName;
@@ -62,7 +82,7 @@ public void setLastAccessAt(Long lastAccessAt) {
         }
     }
 
-    public static class TableData {
+    public static class TableData{
         private List tables;
 
         public List getTables() {
@@ -83,4 +103,5 @@ public TableData getData() {
 
     public void setData(TableData data) {
         this.data = data;
     }
-}
\ No newline at end of file
+}
+
diff --git a/server/src/main/java/edp/DavinciServerApplication.java b/server/src/main/java/edp/DavinciServerApplication.java
index 8ab05a9de..db9876981 100644
--- a/server/src/main/java/edp/DavinciServerApplication.java
+++ b/server/src/main/java/edp/DavinciServerApplication.java
@@ -19,6 +19,7 @@
 
 package edp;
 
+import org.springframework.boot.SpringApplication;
 import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
 import org.springframework.boot.autoconfigure.SpringBootApplication;
 import org.springframework.scheduling.annotation.EnableScheduling;
diff --git a/server/src/main/java/edp/SwaggerConfiguration.java b/server/src/main/java/edp/SwaggerConfiguration.java
new file mode 100644
index 000000000..7e7e6ab3b
--- /dev/null
+++ b/server/src/main/java/edp/SwaggerConfiguration.java
@@ -0,0 +1,82 @@
+/*
+ * <<
+ * Davinci
+ * ==
+ * Copyright (C) 2016 - 2019 EDP
+ * ==
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * >>
+ *
+ */
+
+package edp;
+
+import com.google.common.collect.Lists;
+import edp.core.consts.Consts;
+import edp.core.enums.HttpCodeEnum;
+import org.springframework.context.annotation.Bean;
+import org.springframework.context.annotation.Configuration;
+import org.springframework.web.bind.annotation.RequestMethod;
+import springfox.documentation.builders.ApiInfoBuilder;
+import springfox.documentation.builders.PathSelectors;
+import springfox.documentation.builders.RequestHandlerSelectors;
+import springfox.documentation.builders.ResponseMessageBuilder;
+import springfox.documentation.service.ApiInfo;
+import springfox.documentation.service.ApiKey;
+import springfox.documentation.service.ResponseMessage;
+import springfox.documentation.spi.DocumentationType;
+import springfox.documentation.spring.web.plugins.Docket;
+import springfox.documentation.swagger2.annotations.EnableSwagger2;
+
+import java.util.ArrayList;
+import java.util.List;
+
+@Configuration
+@EnableSwagger2
+public class SwaggerConfiguration {
+    @Bean
+    public Docket createRestApi() {
+
+        List responseMessageList = new ArrayList<>();
+        responseMessageList.add(new ResponseMessageBuilder().code(HttpCodeEnum.OK.getCode()).message(HttpCodeEnum.OK.getMessage()).build());
+        responseMessageList.add(new ResponseMessageBuilder().code(HttpCodeEnum.FAIL.getCode()).message(HttpCodeEnum.FAIL.getMessage()).build());
+        responseMessageList.add(new ResponseMessageBuilder().code(HttpCodeEnum.UNAUTHORIZED.getCode()).message(HttpCodeEnum.UNAUTHORIZED.getMessage()).build());
+        responseMessageList.add(new ResponseMessageBuilder().code(HttpCodeEnum.FORBIDDEN.getCode()).message(HttpCodeEnum.FORBIDDEN.getMessage()).build());
+        responseMessageList.add(new ResponseMessageBuilder().code(HttpCodeEnum.SERVER_ERROR.getCode()).message(HttpCodeEnum.SERVER_ERROR.getMessage()).build());
+
+
+        return new Docket(DocumentationType.SWAGGER_2)
+                .globalResponseMessage(RequestMethod.GET, responseMessageList)
+                .globalResponseMessage(RequestMethod.POST, responseMessageList)
+                .globalResponseMessage(RequestMethod.PUT, responseMessageList)
+                .globalResponseMessage(RequestMethod.DELETE, responseMessageList)
+
+                .apiInfo(apiInfo())
+                .select()
+                .apis(RequestHandlerSelectors.basePackage("edp.davinci.controller"))
+                .paths(PathSelectors.any())
+                .build()
+                .securitySchemes(Lists.newArrayList(apiKey()));
+
+    }
+
+    private ApiInfo apiInfo() {
+        return new ApiInfoBuilder()
+                .title("davinci api")
+                .version("1.0")
+                .build();
+    }
+
+    private ApiKey apiKey() {
+        return new ApiKey(Consts.TOKEN_HEADER_STRING, Consts.TOKEN_HEADER_STRING, "header");
+    }
+
+}
diff --git a/server/src/main/java/edp/core/annotation/MethodLog.java b/server/src/main/java/edp/core/annotation/MethodLog.java
deleted file mode 100644
index 1f8fffa50..000000000
--- a/server/src/main/java/edp/core/annotation/MethodLog.java
+++ /dev/null
@@ -1,14 +0,0 @@
-package edp.core.annotation;
-
-import java.lang.annotation.*;
-
-/**
- * Custom annotation: log interface parameters and return values
- */
-@Documented
-@Retention(RetentionPolicy.RUNTIME)
-@Target({ElementType.METHOD})
-public @interface MethodLog {
-    // 0: log arguments and return value; 1: log arguments only
-    int type() default 0;
-}
diff --git a/server/src/main/java/edp/core/config/CaffeineCacheConfig.java b/server/src/main/java/edp/core/config/CaffeineCacheConfig.java
index a34a3072b..34bda8101 100644
--- a/server/src/main/java/edp/core/config/CaffeineCacheConfig.java
+++ b/server/src/main/java/edp/core/config/CaffeineCacheConfig.java
@@ -30,6 +30,7 @@
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;
 
+import java.lang.reflect.Method;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.concurrent.TimeUnit;
diff --git a/server/src/main/java/edp/core/config/DruidConfig.java b/server/src/main/java/edp/core/config/DruidConfig.java
index fe9148d8e..f86fe8ba4 100644
--- a/server/src/main/java/edp/core/config/DruidConfig.java
+++ b/server/src/main/java/edp/core/config/DruidConfig.java
@@ -22,7 +22,6 @@
 import com.alibaba.druid.pool.DruidDataSource;
 import com.alibaba.druid.support.http.StatViewServlet;
 import com.alibaba.druid.support.http.WebStatFilter;
-import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.boot.web.servlet.FilterRegistrationBean;
@@ -99,7 +98,6 @@ public class DruidConfig {
      *
      * @return
      */
-    @SuppressWarnings("unchecked")
     @Bean
     public ServletRegistrationBean druidServlet() {
         ServletRegistrationBean reg = new ServletRegistrationBean();
@@ -115,7 +113,6 @@ public ServletRegistrationBean druidServlet() {
      *
      * @return
      */
-    @SuppressWarnings("unchecked")
     @Bean
     public FilterRegistrationBean filterRegistrationBean() {
         FilterRegistrationBean filterRegistrationBean = new FilterRegistrationBean();
@@ -134,16 +131,6 @@ public DruidDataSource druidDataSource() {
         DruidDataSource druidDataSource = new DruidDataSource();
         druidDataSource.setUrl(durl);
         druidDataSource.setUsername(username);
-        if((Boolean) CommonConfig.ENABLE_PASSWORD_ENCRYPT().getValue()){
-            String pubKey = CommonConfig.LINKIS_MYSQL_PUB_KEY().getValue();
-            String priKey = CommonConfig.LINKIS_MYSQL_PRIV_KEY().getValue();
-            try {
-//                password = EncryptUtil.decrypt(priKey, password);
-            } catch (Exception e) {
-                log.error("failed to decrypt password for {}", password, e);
-                System.exit(-2);
-            }
-        }
         druidDataSource.setPassword(password);
         druidDataSource.setDriverClassName(driverClassName);
         druidDataSource.setInitialSize(initialSize);
diff --git a/server/src/main/java/edp/core/config/RedisConfig.java b/server/src/main/java/edp/core/config/RedisConfig.java
index d4f47045f..bca7b86f2 100644
--- a/server/src/main/java/edp/core/config/RedisConfig.java
+++ b/server/src/main/java/edp/core/config/RedisConfig.java
@@ -39,7 +39,6 @@ public class RedisConfig {
     @Autowired
     private BeanFactory beanFactory;
 
-    @SuppressWarnings("unchecked")
     @Bean
     public RedisTemplate InitRedisTemplate() {
         RedisTemplate redisTemplate = null;
diff --git a/server/src/main/java/edp/core/config/RestClientConfig.java b/server/src/main/java/edp/core/config/RestClientConfig.java
index 58dd99752..2d625b966 100644
--- a/server/src/main/java/edp/core/config/RestClientConfig.java
+++ b/server/src/main/java/edp/core/config/RestClientConfig.java
@@ -112,7 +112,8 @@ public HttpComponentsClientHttpRequestFactory httpComponentsClientHttpRequestFac
             clientHttpRequestFactory.setConnectionRequestTimeout(connectionRequestTimout);
             return clientHttpRequestFactory;
         } catch (NoSuchAlgorithmException | KeyManagementException | KeyStoreException e) {
-            log.error("Initializing HTTP connection pool ERROR: ", e);
+            e.printStackTrace();
+            log.error("Initializing HTTP connection pool ERROR, {}", e);
         }
         return null;
     }
diff --git a/server/src/main/java/edp/core/inteceptor/RequestJsonHandlerArgumentResolver.java b/server/src/main/java/edp/core/inteceptor/RequestJsonHandlerArgumentResolver.java
index 66c3f5a29..f7991e8bd 100644
--- a/server/src/main/java/edp/core/inteceptor/RequestJsonHandlerArgumentResolver.java
+++ b/server/src/main/java/edp/core/inteceptor/RequestJsonHandlerArgumentResolver.java
@@ -31,7 +31,7 @@
 import java.io.BufferedReader;
 
 /**
- * Annotation resolver
+ * @JsonParam annotation resolver
  */
 public class RequestJsonHandlerArgumentResolver implements HandlerMethodArgumentResolver {
     @Override
diff --git a/server/src/main/java/edp/core/model/Paginate.java b/server/src/main/java/edp/core/model/Paginate.java
index 4b6e94a96..b22cd8c72 100644
--- a/server/src/main/java/edp/core/model/Paginate.java
+++ b/server/src/main/java/edp/core/model/Paginate.java
@@ -42,8 +42,4 @@ public class Paginate implements Serializable {
     public void setResultList(List resultList) {
         this.resultList = resultList;
     }
-
-    public void setTotalCount(long totalCount) {
-        this.totalCount = totalCount;
-    }
 }
diff --git a/server/src/main/java/edp/core/model/RecordInfo.java b/server/src/main/java/edp/core/model/RecordInfo.java
index f2e91a5b3..30d6bf456 100644
--- a/server/src/main/java/edp/core/model/RecordInfo.java
+++ b/server/src/main/java/edp/core/model/RecordInfo.java
@@ -39,7 +39,6 @@ public class RecordInfo {
     @JSONField(serialize = false)
     Date updateTime;
 
-    @SuppressWarnings("unchecked")
     public T createdBy(Long userId) {
         this.createBy = userId;
         this.createTime = new Date();
@@ -50,15 +49,4 @@ public void updatedBy(Long userId) {
         this.updateBy = userId;
         this.updateTime = new Date();
     }
-
-    public void updateByWithoutUpdateTime(Long userId) {
-        this.updateBy = userId;
-    }
-
-    public Date getUpdateTime() {
-        if(updateTime == null){
-            return createTime;
-        }
-        return updateTime;
-    }
 }
diff --git a/server/src/main/java/edp/core/utils/CustomDataSourceUtils.java b/server/src/main/java/edp/core/utils/CustomDataSourceUtils.java
index 597b18a94..5fd1c0483 100644
--- a/server/src/main/java/edp/core/utils/CustomDataSourceUtils.java
+++ b/server/src/main/java/edp/core/utils/CustomDataSourceUtils.java
@@ -55,7 +55,6 @@ public static CustomDataSource getInstance(String jdbcUrl, String version) {
     }
 
-    @SuppressWarnings("unchecked")
     public static void loadAllFromYaml(String yamlPath) throws Exception {
         if (StringUtils.isEmpty(yamlPath)) {
             return;
diff --git a/server/src/main/java/edp/core/utils/FileUtils.java b/server/src/main/java/edp/core/utils/FileUtils.java
index a4806517a..f84b934c8 100644
--- a/server/src/main/java/edp/core/utils/FileUtils.java
+++ b/server/src/main/java/edp/core/utils/FileUtils.java
@@ -23,8 +23,6 @@
 import edp.davinci.core.enums.ActionEnum;
 import edp.davinci.core.enums.FileTypeEnum;
 import edp.davinci.service.excel.MsgWrapper;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.stereotype.Component;
 import org.springframework.web.multipart.MultipartFile;
@@ -44,8 +42,6 @@
 @Component
 public class FileUtils {
 
-    final static Logger log = LoggerFactory.getLogger(FileUtils.class);
-
     @Value("${file.userfiles-path}")
     public String fileBasePath;
 
@@ -151,7 +147,7 @@ public void download(String filePath, HttpServletResponse response) {
             os.write(buffer);
             os.flush();
         } catch (IOException e) {
-            log.error("Reading the file failed while downloading the file: ", e);
+            e.printStackTrace();
         } finally {
             try {
                 if (null != is) {
@@ -161,7 +157,7 @@ public void download(String filePath, HttpServletResponse response) {
                     os.close();
                 }
             } catch (IOException e) {
-                log.error("Error in closing file: ", e);
+                e.printStackTrace();
             }
             remove(filePath);
         }
@@ -238,7 +234,7 @@ public static void zipFile(List files, File targetFile) {
             }
             out.close();
         } catch (Exception e) {
-            log.error("Compressed file error: ", e);
+            e.printStackTrace();
         }
     }
diff --git a/server/src/main/java/edp/core/utils/MailUtils.java b/server/src/main/java/edp/core/utils/MailUtils.java
index 61d4b974f..7a70adf8b 100644
--- a/server/src/main/java/edp/core/utils/MailUtils.java
+++ b/server/src/main/java/edp/core/utils/MailUtils.java
@@ -94,6 +94,7 @@ public void sendSimpleEmail(String from, String subject, String[] to, String[] c
             log.info("send mail success, in {} million seconds", System.currentTimeMillis() - startTimestamp);
         } catch (MailException e) {
             log.error("send mail failed, {} \n", e.getMessage());
+            e.printStackTrace();
             throw new ServerException(e.getMessage());
         }
     }
@@ -195,9 +196,11 @@ public void sendHtmlEmail(String from, String nickName, String subject, String[]
             log.info("Send mail success, in {} million seconds", System.currentTimeMillis() - startTimestamp);
         } catch (MessagingException e) {
             log.error("Send mail failed, {}\n", e.getMessage());
+            e.printStackTrace();
             throw new ServerException(e.getMessage());
         } catch (UnsupportedEncodingException e) {
             log.error("Send mail failed, {}\n", e.getMessage());
+            e.printStackTrace();
         }
     }
@@ -340,9 +343,11 @@ public void sendTemplateEmail(String from, String nickName, String subject, Stri
             log.info("Send mail success, in {} million seconds", System.currentTimeMillis() - startTimestamp);
         } catch (MessagingException e) {
             log.error("Send mail failed, {}\n", e.getMessage());
+            e.printStackTrace();
             throw new ServerException(e.getMessage());
         } catch (UnsupportedEncodingException e) {
             log.error("Send mail failed, {}\n", e.getMessage());
+            e.printStackTrace();
         }
     }
diff --git a/server/src/main/java/edp/core/utils/SourceUtils.java b/server/src/main/java/edp/core/utils/SourceUtils.java
index 0d5456794..6cac2fd0d 100644
--- a/server/src/main/java/edp/core/utils/SourceUtils.java
+++ b/server/src/main/java/edp/core/utils/SourceUtils.java
@@ -21,7 +21,6 @@
 import com.alibaba.druid.util.StringUtils;
 import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig;
-import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils;
 import edp.core.common.jdbc.ESDataSource;
 import edp.core.common.jdbc.ExtendedJdbcClassLoader;
 import edp.core.common.jdbc.JdbcDataSource;
@@ -120,6 +119,7 @@ void releaseConnection(Connection connection) {
             connection.close();
             connection = null;
         } catch (Exception e) {
+            e.printStackTrace();
             log.error("connection close error", e.getMessage());
         }
     }
@@ -132,7 +132,7 @@ public static void closeResult(ResultSet rs) {
                 rs.close();
                 rs = null;
             } catch (Exception e) {
-                log.error("Failed to close result set: ", e);
+                e.printStackTrace();
             }
         }
     }
@@ -191,9 +191,6 @@ public static String getDataSourceName(String jdbcUrl) {
         if(CommonConfig.HIVE_DATASOURCE_URL().getValue().equals(jdbcUrl)){
             return CommonConfig.HIVE_DATASOURCE_NAME().getValue();
         }
-        if(VisualisUtils.PRESTO().getValue().equalsIgnoreCase(jdbcUrl)){
-            return jdbcUrl;
-        }
         String dataSourceName = null;
         jdbcUrl = jdbcUrl.replaceAll(NEW_LINE_CHAR, EMPTY).replaceAll(SPACE, EMPTY).trim().toLowerCase();
         Matcher matcher = PATTERN_JDBC_TYPE.matcher(jdbcUrl);
diff --git a/server/src/main/java/edp/core/utils/SqlUtils.java b/server/src/main/java/edp/core/utils/SqlUtils.java
index b21fcfeb1..bf0ff6ec6 100644
--- a/server/src/main/java/edp/core/utils/SqlUtils.java
+++ b/server/src/main/java/edp/core/utils/SqlUtils.java
@@ -20,8 +20,6 @@
 package edp.core.utils;
 
 import com.alibaba.druid.util.StringUtils;
-import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig;
-import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus;
 import edp.core.common.jdbc.JdbcDataSource;
 import edp.core.consts.Consts;
 import edp.core.enums.DataTypeEnum;
@@ -38,17 +36,23 @@
 import net.sf.jsqlparser.parser.CCJSqlParserUtil;
 import net.sf.jsqlparser.schema.Table;
 import net.sf.jsqlparser.statement.select.*;
+
 import org.apache.commons.lang.NotImplementedException;
 import org.joda.time.DateTime;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
+
 import org.springframework.beans.factory.annotation.Value;
+
 import org.springframework.cache.annotation.CachePut;
 import org.springframework.cache.annotation.Cacheable;
+import org.springframework.context.annotation.Scope;
 import org.springframework.jdbc.core.JdbcTemplate;
+import org.springframework.stereotype.Component;
 
 import javax.sql.DataSource;
+import java.io.StringReader;
 import java.math.BigDecimal;
 import java.sql.*;
 import java.util.*;
@@ -106,7 +110,7 @@ public SqlUtils init(BaseSource source) {
                 .SqlUtils()
                 .withJdbcUrl(source.getJdbcUrl())
                 .withUsername(source.getUsername())
-                .withPassword(decryptPassword(source.getJdbcUrl(), source.getPassword()))
+                .withPassword(source.getPassword())
                 .withDbVersion(source.getDbVersion())
                 .withIsExt(source.isExt())
                 .withJdbcDataSource(this.jdbcDataSource)
@@ -120,7 +124,7 @@ public SqlUtils init(String jdbcUrl, String username, String password, String db
                 .SqlUtils()
                 .withJdbcUrl(jdbcUrl)
                 .withUsername(username)
-                .withPassword(decryptPassword(jdbcUrl, password))
+                .withPassword(password)
                 .withDbVersion(dbVersion)
                 .withIsExt(ext)
                 .withJdbcDataSource(this.jdbcDataSource)
@@ -129,22 +133,6 @@ public SqlUtils init(String jdbcUrl, String username, String password, String db
                 .build();
     }
 
-    private String decryptPassword(String jdbcUrl, String password) {
-        if(jdbcUrl.contains(CommonConfig.JDBC_ENCRYPT_PARAMETER().getValue())){
-            String decryptedPassword = "";
-            String[] passwordPrivateKey = org.apache.commons.lang.StringUtils.split(password, "@");
-            try {
-//                decryptedPassword = EncryptUtil.decrypt(passwordPrivateKey[1], passwordPrivateKey[0]);
-            } catch (Exception e) {
-                log.error("failed to decrypt password for {" + password + "}", e);
-                throw new ServerException("failed to decrypt password", e);
-            }
-            return decryptedPassword;
-        } else {
-            return password;
-        }
-    }
-
     public void execute(String sql) throws ServerException {
         sql = filterAnnotate(sql);
         checkSensitiveSql(sql);
@@ -154,7 +142,7 @@ public void execute(String sql) throws ServerException {
         try {
             jdbcTemplate().execute(sql);
         } catch (Exception e) {
-            log.error("SQL execution error: ", e);
+            e.printStackTrace();
             throw new ServerException(e.getMessage());
         }
     }
@@ -179,41 +167,6 @@ public PaginateWithQueryColumns syncQuery4Paginate(String sql, Integer pageNo, I
         return paginate;
     }
 
-    public PaginateWithExecStatus asyncQuery4Exec(String sql, Integer pageNo, Integer pageSize, Integer totalCount, Integer limit, Set excludeColumns) throws Exception {
-        if (null == pageNo || pageNo < 1) {
-            pageNo = 0;
-        }
-        if (null == pageSize || pageSize < 1) {
-            pageSize = 0;
-        }
-        if (null == totalCount || totalCount < 1) {
-            totalCount = 0;
-        }
- - if (null == limit) { - limit = -1; - } - - PaginateWithExecStatus paginate = submit4Exec(sql, pageNo, pageSize, totalCount, limit, excludeColumns); - return paginate; - } - - public PaginateWithExecStatus submit4Exec(String sql, int pageNo, int pageSize, int totalCount, int limit, Set excludeColumns) throws Exception { - throw new NotImplementedException(""); - } - - public PaginateWithExecStatus getProgress4Exec(String execId, String user) throws Exception { - throw new NotImplementedException(""); - } - - public PaginateWithExecStatus kill4Exec(String execId, String user) throws Exception { - throw new NotImplementedException(""); - } - - public PaginateWithExecStatus getResultSet4Exec(String execId, String user) throws Exception { - throw new NotImplementedException(""); - } - @CachePut(value = "query", key = "#sql") public List> query4List(String sql, int limit) throws Exception { sql = filterAnnotate(sql); @@ -467,7 +420,7 @@ public List getDatabases() throws SourceException { } } catch (Exception e) { - log.error(e.getMessage()); + e.printStackTrace(); throw new SourceException(e.getMessage() + ", jdbcUrl=" + this.jdbcUrl); } finally { sourceUtils.releaseConnection(connection); @@ -514,7 +467,7 @@ public List getTableList(String dbName) throws SourceException { tables.close(); } } catch (Exception e) { - log.error(e.getMessage()); + e.printStackTrace(); throw new SourceException(e.getMessage() + ", jdbcUrl=" + this.jdbcUrl); } finally { sourceUtils.releaseConnection(connection); @@ -566,7 +519,7 @@ public TableInfo getTableInfo(String dbName, String tableName) throws SourceExce tableInfo = new TableInfo(tableName, primaryKeys, columns); } } catch (SQLException e) { - log.error("Failed to get table column: ", e); + e.printStackTrace(); throw new SourceException(e.getMessage() + ", jdbcUrl=" + this.jdbcUrl); } finally { sourceUtils.releaseConnection(connection); @@ -837,12 +790,12 @@ public void executeBatch(String sql, Set headers, List headers, List 
doFilterSources(List sources) { - return null; - } - - public List doFilterViews(List views) { - String defaultVersion = "v1_v000000"; - HashMap> viewMap = new HashMap<>(); - for (ViewBaseInfo view : views) { - String shortName = getShortName(view.getName()); - String version = getSuffixVersion(view.getName()); - if (null == version) { - version = defaultVersion; - } - if (viewMap.containsKey(shortName)) { - Tuple currentItem = viewMap.get(shortName); - String currentVersion = currentItem.first; - if (currentVersion.compareTo(version) <= 0) { - currentItem.setFirst(version); - currentItem.setSecond(view); - } - } else { - Tuple item = new Tuple<>(version, view); - viewMap.put(shortName, item); - } - } - List viewList = new ArrayList<>(); - for (Map.Entry> item : viewMap.entrySet()) { - ViewBaseInfo selectView = item.getValue().getSecond(); - selectView.setName(item.getKey()); - viewList.add(selectView); - } - return viewList; - } - - public List doFilterWidgets(List widgets) { - String defaultVersion = "v1_v000000"; - HashMap> widgetMap = new HashMap<>(); - for (Widget widget : widgets) { - String shortName = getShortName(widget.getName()); - String version = getSuffixVersion(widget.getName()); - if (null == version) { - version = defaultVersion; - } - if (widgetMap.containsKey(shortName)) { - Tuple currentItem = widgetMap.get(shortName); - String currentVersion = currentItem.first; - if (currentVersion.compareTo(version) <= 0) { - currentItem.setFirst(version); - currentItem.setSecond(widget); - } - } else { - Tuple item = new Tuple<>(version, widget); - widgetMap.put(shortName, item); - } - } - List widgetList = new ArrayList<>(); - for (Map.Entry> item : widgetMap.entrySet()) { - Widget selectWidget = item.getValue().getSecond(); - selectWidget.setName(item.getKey()); - widgetList.add(selectWidget); - } - return widgetList; - } - - public List doFilterDisplays(List displays) { - String defaultVersion = "v1_v000000"; - HashMap> displayMap = new HashMap<>(); - for 
(Display display : displays) { - String shortName = getShortName(display.getName()); - String version = getSuffixVersion(display.getName()); - if (null == version) { - version = defaultVersion; - } - if (displayMap.containsKey(shortName)) { - Tuple currentItem = displayMap.get(shortName); - String currentVersion = currentItem.first; - if (currentVersion.compareTo(version) <= 0) { - currentItem.setFirst(version); - currentItem.setSecond(display); - } - } else { - Tuple item = new Tuple<>(version, display); - displayMap.put(shortName, item); - } - } - List displayList = new ArrayList<>(); - for (Map.Entry> item : displayMap.entrySet()) { - Display selectDisplay = item.getValue().getSecond(); - selectDisplay.setName(item.getKey()); - displayList.add(selectDisplay); - } - return displayList; - } - - public List doFilterDashboardPortal(List dashboardPortals) { - String defaultVersion = "v1_v000000"; - HashMap> dashboardPortalMap = new HashMap<>(); - for (DashboardPortal dashboardPortal : dashboardPortals) { - String shortName = getShortName(dashboardPortal.getName()); - String version = getSuffixVersion(dashboardPortal.getName()); - if (null == version) { - version = defaultVersion; - } - if (dashboardPortalMap.containsKey(shortName)) { - Tuple currentItem = dashboardPortalMap.get(shortName); - String currentVersion = currentItem.first; - if (currentVersion.compareTo(version) <= 0) { - currentItem.setFirst(version); - currentItem.setSecond(dashboardPortal); - } - } else { - Tuple item = new Tuple<>(version, dashboardPortal); - dashboardPortalMap.put(shortName, item); - } - } - List dashboardPortalList = new ArrayList<>(); - for (Map.Entry> item : dashboardPortalMap.entrySet()) { - DashboardPortal selectDashboardPortal = item.getValue().getSecond(); - selectDashboardPortal.setName(item.getKey()); - dashboardPortalList.add(selectDashboardPortal); - } - return dashboardPortalList; - } - - private String getShortName(String longName) { - String shortName; - String version = 
getSuffixVersion(longName); - if (null == version) { - shortName = longName; - } else { - shortName = longName.substring(0, longName.length() - version.length() - 1); - } - log.info("Get component name: " + shortName); - return shortName; - } - - private String getSuffixVersion(String longName) { - String version; - Matcher matcherVersionPattern = suffixVersionPattern.matcher(longName); - if (matcherVersionPattern.find()) { - version = matcherVersionPattern.group(); - log.info("Get component version: " + version); - } else { - version = null; - log.info("The component does not match the version."); - } - return version; - } - - class Tuple { - public A first; - public B second; - - public Tuple(A a, B b) { - first = a; - second = b; - } - - public A getFirst() { - return first; - } - - public void setFirst(A first) { - this.first = first; - } - - public B getSecond() { - return second; - } - - public void setSecond(B second) { - this.second = second; - } - - public String toString() { - return "(" + first + ", " + second + ")"; - } - } -} diff --git a/server/src/main/java/edp/davinci/common/utils/ScriptUtils.java b/server/src/main/java/edp/davinci/common/utils/ScriptUtiils.java similarity index 86% rename from server/src/main/java/edp/davinci/common/utils/ScriptUtils.java rename to server/src/main/java/edp/davinci/common/utils/ScriptUtiils.java index 44f7fdea2..5783aeeb8 100644 --- a/server/src/main/java/edp/davinci/common/utils/ScriptUtils.java +++ b/server/src/main/java/edp/davinci/common/utils/ScriptUtiils.java @@ -20,11 +20,7 @@ package edp.davinci.common.utils; import com.alibaba.druid.util.StringUtils; -import com.alibaba.fastjson.JSONObject; import com.fasterxml.jackson.databind.ObjectMapper; -import com.google.common.collect.Lists; -import com.google.gson.JsonObject; -import com.webank.wedatasphere.dss.visualis.query.model.VirtualView; import edp.davinci.core.common.Constants; import edp.davinci.core.enums.FieldFormatTypeEnum; import 
edp.davinci.core.enums.NumericUnitEnum; @@ -34,7 +30,6 @@ import edp.davinci.dto.viewDto.Param; import edp.davinci.dto.viewDto.ViewExecuteParam; import jdk.nashorn.api.scripting.ScriptObjectMirror; -import lombok.extern.slf4j.Slf4j; import org.springframework.stereotype.Component; import javax.script.Invocable; @@ -44,15 +39,16 @@ import java.io.InputStreamReader; import java.lang.reflect.InvocationTargetException; import java.lang.reflect.Method; -import java.util.*; +import java.util.ArrayList; +import java.util.Collection; +import java.util.List; import static edp.core.consts.Consts.EMPTY; import static edp.davinci.core.common.Constants.EXCEL_FORMAT_TYPE_KEY; -@Slf4j @Component -public class ScriptUtils { - private static ClassLoader classLoader = ScriptUtils.class.getClassLoader(); +public class ScriptUtiils { + private static ClassLoader classLoader = ScriptUtiils.class.getClassLoader(); public static ScriptEngine getCellValueScriptEngine() throws Exception { @@ -74,11 +70,10 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d if (obj instanceof ScriptObjectMirror) { ScriptObjectMirror vsom = (ScriptObjectMirror) obj; - Set groups = new HashSet<>(); + List groups = new ArrayList<>(); List aggregators = new ArrayList<>(); List orders = new ArrayList<>(); List filters = new ArrayList<>(); - VirtualView virtualView = null; Boolean cache = false; Boolean nativeQuery = false; @@ -90,7 +85,7 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d ScriptObjectMirror groupMirror = (ScriptObjectMirror) vsom.get(key); if (groupMirror.isArray()) { Collection values = groupMirror.values(); - values.forEach(v -> groups.add(getRealColumn(String.valueOf(v)))); + values.forEach(v -> groups.add(String.valueOf(v))); } break; case "aggregators": @@ -99,7 +94,7 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d Collection values = aggregatorsMirror.values(); values.forEach(v -> { 
ScriptObjectMirror agg = (ScriptObjectMirror) v; - Aggregator aggregator = new Aggregator(getRealColumn(String.valueOf(agg.get("column"))), String.valueOf(agg.get("func"))); + Aggregator aggregator = new Aggregator(String.valueOf(agg.get("column")), String.valueOf(agg.get("func"))); aggregators.add(aggregator); }); } @@ -110,7 +105,7 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d Collection values = ordersMirror.values(); values.forEach(v -> { ScriptObjectMirror agg = (ScriptObjectMirror) v; - Order order = new Order(getRealColumn(String.valueOf(agg.get("column"))), String.valueOf(agg.get("direction"))); + Order order = new Order(String.valueOf(agg.get("column")), String.valueOf(agg.get("direction"))); orders.add(order); }); } @@ -127,7 +122,7 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d if (filterMirror.isArray() && filterMirror.size() > 0) { Collection values = filterMirror.values(); values.forEach(v -> { - filters.add(getRealColumn(String.valueOf(v))); + filters.add(String.valueOf(v)); }); } } @@ -144,42 +139,28 @@ public static ViewExecuteParam getViewExecuteParam(ScriptEngine engine, String d Collection values = paramsMirror.values(); values.forEach(v -> { ScriptObjectMirror agg = (ScriptObjectMirror) v; - Param param = new Param(getRealColumn(String.valueOf(agg.get("name"))), String.valueOf(agg.get("value"))); + Param param = new Param(String.valueOf(agg.get("name")), String.valueOf(agg.get("value"))); params.add(param); }); } break; case "nativeQuery": nativeQuery = (Boolean) vsom.get(key); - nativeQuery = nativeQuery == null ? 
false : nativeQuery; - break; - case "view": - if (vsom.get(key) instanceof Integer) { - break; - } - virtualView = JSONObject.parseObject(JSONObject.toJSONString(vsom.get(key)), VirtualView.class); break; } } - return new ViewExecuteParam(Lists.newArrayList(groups), aggregators, orders, filters, virtualView, params, cache, expired, nativeQuery); + return new ViewExecuteParam(groups, aggregators, orders, filters, params, cache, expired, nativeQuery); } } catch (ScriptException e) { - log.error(e.getMessage()); + e.printStackTrace(); } catch (NoSuchMethodException e) { - log.error(e.getMessage()); + e.printStackTrace(); } return null; } - private static String getRealColumn(String columnName) { - if (columnName.contains("@Visualis@")) { - columnName = org.apache.commons.lang.StringUtils.substringBefore(columnName, "@Visualis@"); - } - return columnName; - } - public static List formatHeader(ScriptEngine engine, String json, List params) { try { @@ -278,11 +259,11 @@ public static List formatHeader(ScriptEngine engine, String json, L method.invoke(header, vsom.get(key)); } } catch (NoSuchMethodException e) { - log.error(e.getMessage()); + e.printStackTrace(); } catch (IllegalAccessException e) { - log.error(e.getMessage()); + e.printStackTrace(); } catch (InvocationTargetException e) { - log.error(e.getMessage()); + e.printStackTrace(); } finally { continue; } @@ -295,9 +276,9 @@ public static List formatHeader(ScriptEngine engine, String json, L } } catch (ScriptException e) { - log.error(e.getMessage()); + e.printStackTrace(); } catch (NoSuchMethodException e) { - log.error(e.getMessage()); + e.printStackTrace(); } return null; } diff --git a/server/src/main/java/edp/davinci/controller/CheckController.java b/server/src/main/java/edp/davinci/controller/CheckController.java index a32226f9b..9572bc046 100644 --- a/server/src/main/java/edp/davinci/controller/CheckController.java +++ b/server/src/main/java/edp/davinci/controller/CheckController.java @@ -21,16 +21,18 @@ 
import edp.core.annotation.AuthIgnore; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.enums.HttpCodeEnum; import edp.core.utils.TokenUtils; import edp.davinci.core.common.Constants; import edp.davinci.core.common.ResultMap; import edp.davinci.core.enums.CheckEntityEnum; -import edp.davinci.model.Project; import edp.davinci.model.User; import edp.davinci.service.CheckService; import edp.davinci.service.ProjectService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; @@ -39,12 +41,15 @@ import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RequestParam; import org.springframework.web.bind.annotation.RestController; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; +@Api(value = "/check", tags = "check", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "sources not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/check", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/check", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class CheckController { @Autowired @@ -62,7 +67,7 @@ public class CheckController { * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique username") @AuthIgnore @GetMapping("/user") public ResponseEntity checkUser(@RequestParam String username, @@ -72,7 +77,8 @@ public ResponseEntity checkUser(@RequestParam String username, ResultMap resultMap = checkService.checkSource(username, id, CheckEntityEnum.USER, null, request); return 
ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check user error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -83,8 +89,8 @@ public ResponseEntity checkUser(@RequestParam String username, * @param request * @return */ - @MethodLog - @GetMapping("/organization" ) + @ApiOperation(value = "check unique organization name") + @GetMapping("/organization") public ResponseEntity checkOrganization(@RequestParam String name, @RequestParam(required = false) Long id, HttpServletRequest request) { @@ -92,7 +98,8 @@ public ResponseEntity checkOrganization(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.ORGANIZATION, null, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check organization error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -103,15 +110,15 @@ public ResponseEntity checkOrganization(@RequestParam String name, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique project name") @GetMapping("/project") - public ResponseEntity checkProject(@CurrentUser User user, + public ResponseEntity checkProject(@ApiIgnore @CurrentUser User user, @RequestParam String name, @RequestParam(required = false) Long id, @RequestParam(required = false) Long orgId, HttpServletRequest request) { try { ResultMap resultMap = new ResultMap(tokenUtils); - if (projectService.isExist(name, id, orgId, user.getId())) { + if(projectService.isExist(name, id, orgId, user.getId())){ resultMap = resultMap.failAndRefreshToken(request) .message("the current project name is already taken"); } else { @@ -119,7 +126,8 @@ public ResponseEntity 
checkProject(@CurrentUser User user, } return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check project error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -131,7 +139,7 @@ public ResponseEntity checkProject(@CurrentUser User user, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique display name") @GetMapping("/display") public ResponseEntity checkDisplay(@RequestParam String name, @RequestParam(required = false) Long id, @@ -140,7 +148,8 @@ public ResponseEntity checkDisplay(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.DISPLAY, projectId, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check display error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -151,7 +160,7 @@ public ResponseEntity checkDisplay(@RequestParam String name, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique source name") @GetMapping("/source") public ResponseEntity checkSource(@RequestParam String name, @RequestParam(required = false) Long id, @@ -160,7 +169,8 @@ public ResponseEntity checkSource(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.SOURCE, projectId, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check source error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -171,7 +181,7 @@ public ResponseEntity checkSource(@RequestParam String name, * 
@param request * @return */ - @MethodLog + @ApiOperation(value = "check unique view name") @GetMapping("/view") public ResponseEntity checkView(@RequestParam String name, @RequestParam(required = false) Long id, @@ -180,7 +190,8 @@ public ResponseEntity checkView(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.VIEW, projectId, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check view error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -192,7 +203,7 @@ public ResponseEntity checkView(@RequestParam String name, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique widget name") @GetMapping("/widget") public ResponseEntity checkWidget(@RequestParam String name, @RequestParam(required = false) Long id, @@ -201,7 +212,8 @@ public ResponseEntity checkWidget(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.WIDGET, projectId, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check widget error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -212,7 +224,7 @@ public ResponseEntity checkWidget(@RequestParam String name, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique dashboard name") @GetMapping("/dashboardPortal") public ResponseEntity checkDashboardPortal(@RequestParam String name, @RequestParam(required = false) Long id, @@ -221,7 +233,8 @@ public ResponseEntity checkDashboardPortal(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.DASHBOARDPORTAL, projectId, request); 
return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check dashboardPortal error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -232,7 +245,7 @@ public ResponseEntity checkDashboardPortal(@RequestParam String name, * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique dashboard name") @GetMapping("/dashboard") public ResponseEntity checkDashboard(@RequestParam String name, @RequestParam(required = false) Long id, @@ -241,36 +254,20 @@ public ResponseEntity checkDashboard(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.DASHBOARD, portal, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check dashboard error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } - /** - * 检查工程名是否存在 - * - * @param keywords - * @param request - * @return - */ - @MethodLog - @GetMapping("/projectName") - public ResponseEntity checkProjectName(@RequestParam(value = "keywords") String keywords, - HttpServletRequest request) { - - Project project = projectService.checkProjectName(keywords); - return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(project)); - } - - /** * 检查cronjob是否存在 * * @param request * @return */ - @MethodLog + @ApiOperation(value = "check unique dashboard name") @GetMapping("/cronjob") public ResponseEntity checkCronJob(@RequestParam String name, @RequestParam(required = false) Long id, @@ -279,7 +276,8 @@ public ResponseEntity checkCronJob(@RequestParam String name, ResultMap resultMap = checkService.checkSource(name, id, CheckEntityEnum.CRONJOB, projectId, request); return 
ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("check cronjob error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } diff --git a/server/src/main/java/edp/davinci/controller/CronJobController.java b/server/src/main/java/edp/davinci/controller/CronJobController.java index 01c8f14e3..596c04329 100755 --- a/server/src/main/java/edp/davinci/controller/CronJobController.java +++ b/server/src/main/java/edp/davinci/controller/CronJobController.java @@ -21,7 +21,6 @@ import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.davinci.common.controller.BaseController; import edp.davinci.core.common.Constants; import edp.davinci.core.common.ResultMap; @@ -31,20 +30,27 @@ import edp.davinci.model.CronJob; import edp.davinci.model.User; import edp.davinci.service.CronJobService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; import java.util.List; +@Api(value = "/cronjobs", tags = "cronjobs", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "cronjob not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/cronjobs", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/cronjobs", produces = 
MediaType.APPLICATION_JSON_UTF8_VALUE) public class CronJobController extends BaseController { @Autowired @@ -58,10 +64,10 @@ public class CronJobController extends BaseController { * @param request * @return */ - @MethodLog + @ApiOperation(value = "get jobs") @GetMapping public ResponseEntity getCronJobs(@RequestParam Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(projectId)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); @@ -81,11 +87,11 @@ public ResponseEntity getCronJobs(@RequestParam Long projectId, * @param request * @return */ - @MethodLog + @ApiOperation(value = "create job") @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity createCronJob(@Valid @RequestBody CronJobBaseInfo cronJob, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -108,12 +114,12 @@ public ResponseEntity createCronJob(@Valid @RequestBody CronJobBaseInfo cronJob, * @param request * @return */ - @MethodLog + @ApiOperation(value = "update job") @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity updateCronJob(@PathVariable Long id, @Valid @RequestBody CronJobUpdate cronJob, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -138,10 +144,10 @@ public ResponseEntity updateCronJob(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "delete job") @DeleteMapping("/{id}") public ResponseEntity deleteCronJob(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -162,10 
+168,10 @@ public ResponseEntity deleteCronJob(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "start job") @PostMapping("/start/{id}") public ResponseEntity startCronJob(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -177,10 +183,11 @@ public ResponseEntity startCronJob(@PathVariable Long id, return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(cronJob)); } - @MethodLog + + @ApiOperation(value = "stop job") @PostMapping("/stop/{id}") public ResponseEntity stopCronJob(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { diff --git a/server/src/main/java/edp/davinci/controller/DashboardController.java b/server/src/main/java/edp/davinci/controller/DashboardController.java index 1d3d34505..47cf428e8 100644 --- a/server/src/main/java/edp/davinci/controller/DashboardController.java +++ b/server/src/main/java/edp/davinci/controller/DashboardController.java @@ -20,9 +20,7 @@ package edp.davinci.controller; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.davinci.common.controller.BaseController; import edp.davinci.core.common.Constants; import edp.davinci.core.common.ResultMap; @@ -33,21 +31,27 @@ import edp.davinci.model.User; import edp.davinci.service.DashboardPortalService; import edp.davinci.service.DashboardService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.http.HttpStatus; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; 
 import org.springframework.validation.BindingResult;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;
 
 import javax.servlet.http.HttpServletRequest;
 import javax.validation.Valid;
 import java.util.List;
 
+@Api(value = "/dashboardPortals", tags = "dashboardPortals", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "dashboardPortal not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/dashboardPortals", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/dashboardPortals", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class DashboardController extends BaseController {
 
     @Autowired
@@ -56,9 +60,6 @@ public class DashboardController extends BaseController {
     @Autowired
     private DashboardService dashboardService;
 
-    @Autowired
-    private ProjectAuth projectAuth;
-
     /**
      * Get the dashboardPortal list
      *
@@ -67,10 +68,10 @@ public class DashboardController extends BaseController {
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get dashboardPortals")
     @GetMapping
     public ResponseEntity getDashboardPortals(@RequestParam Long projectId,
-                                              @CurrentUser User user,
+                                              @ApiIgnore @CurrentUser User user,
                                               HttpServletRequest request) {
         if (invalidId(projectId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id");
@@ -89,10 +90,10 @@ public ResponseEntity getDashboardPortals(@RequestParam Long projectId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get dashboards")
     @GetMapping("/{id}/dashboards")
     public ResponseEntity getDashboards(@PathVariable Long id,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id");
@@ -111,7 +112,7 @@ public ResponseEntity getDashboards(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get dashboard exclude roles")
     @GetMapping("/dashboard/{id}/exclude/roles")
     public ResponseEntity getDashboardExcludeRoles(@PathVariable Long id,
                                                    HttpServletRequest request) {
@@ -132,9 +133,10 @@ public ResponseEntity getDashboardExcludeRoles(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get dashboard portal exclude roles")
     @GetMapping("/{id}/exclude/roles")
     public ResponseEntity getPortalExcludeRoles(@PathVariable Long id,
+                                                @ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id");
@@ -154,11 +156,11 @@ public ResponseEntity getPortalExcludeRoles(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get dashboard widgets")
     @GetMapping("/{portalId}/dashboards/{dashboardId}")
     public ResponseEntity getDashboardMemWidgets(@PathVariable("portalId") Long portalId,
                                                  @PathVariable("dashboardId") Long dashboardId,
-                                                 @CurrentUser User user,
+                                                 @ApiIgnore @CurrentUser User user,
                                                  HttpServletRequest request) {
         if (invalidId(portalId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid dashboard portal id");
@@ -184,11 +186,11 @@ public ResponseEntity getDashboardMemWidgets(@PathVariable("portalId") Long port
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create dashboard portal")
     @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createDashboardPortal(@Valid @RequestBody DashboardPortalCreate dashboardPortal,
-                                                BindingResult bindingResult,
-                                                @CurrentUser User user,
+                                                @ApiIgnore BindingResult bindingResult,
+                                                @ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -196,10 +198,6 @@ public ResponseEntity createDashboardPortal(@Valid @RequestBody DashboardPortalC
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
         }
 
-        if(!projectAuth.isPorjectOwner(dashboardPortal.getProjectId(), user.getId())) {
-            return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build();
-        }
-
         DashboardPortal portal = dashboardPortalService.createDashboardPortal(dashboardPortal, user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(portal));
     }
@@ -215,12 +213,12 @@ public ResponseEntity createDashboardPortal(@Valid @RequestBody DashboardPortalC
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update dashboard portal")
     @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateDashboardPortal(@PathVariable Long id,
                                                 @Valid @RequestBody DashboardPortalUpdate dashboardPortalUpdate,
-                                                BindingResult bindingResult,
-                                                @CurrentUser User user,
+                                                @ApiIgnore BindingResult bindingResult,
+                                                @ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -246,10 +244,10 @@ public ResponseEntity updateDashboardPortal(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
-    @PostMapping("/{id}")
+    @ApiOperation(value = "delete dashboard portal")
+    @DeleteMapping("/{id}")
     public ResponseEntity deleteDashboardPortal(@PathVariable Long id,
-                                                @CurrentUser User user,
+                                                @ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
 
         if (invalidId(id)) {
@@ -272,12 +270,12 @@ public ResponseEntity deleteDashboardPortal(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create dashboard")
     @PostMapping(value = "/{id}/dashboards", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createDashboard(@PathVariable("id") Long portalId,
                                           @Valid @RequestBody DashboardCreate dashboardCreate,
-                                          BindingResult bindingResult,
-                                          @CurrentUser User user,
+                                          @ApiIgnore BindingResult bindingResult,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -304,12 +302,12 @@ public ResponseEntity createDashboard(@PathVariable("id") Long portalId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update dashboards")
     @PutMapping(value = "{id}/dashboards", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateDashboards(@PathVariable("id") Long portalId,
                                            @Valid @RequestBody DashboardDto[] dashboards,
-                                           BindingResult bindingResult,
-                                           @CurrentUser User user,
+                                           @ApiIgnore BindingResult bindingResult,
+                                           @ApiIgnore @CurrentUser User user,
                                            HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -337,10 +335,10 @@ public ResponseEntity updateDashboards(@PathVariable("id") Long portalId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete dashboard")
     @DeleteMapping("/dashboards/{dashboardId}")
     public ResponseEntity deleteDashboard(@PathVariable Long dashboardId,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
 
         if (invalidId(dashboardId)) {
@@ -364,13 +362,13 @@ public ResponseEntity deleteDashboard(@PathVariable Long dashboardId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create dashboard widget relation")
     @PostMapping(value = "/{portalId}/dashboards/{dashboardId}/widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createMemDashboardWidget(@PathVariable("portalId") Long portalId,
                                                    @PathVariable("dashboardId") Long dashboardId,
                                                    @Valid @RequestBody MemDashboardWidgetCreate[] memDashboardWidgetCreates,
-                                                   BindingResult bindingResult,
-                                                   @CurrentUser User user,
+                                                   @ApiIgnore BindingResult bindingResult,
+                                                   @ApiIgnore @CurrentUser User user,
                                                    HttpServletRequest request) {
         if (invalidId(portalId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid dashboard portal id");
@@ -407,12 +405,12 @@ public ResponseEntity createMemDashboardWidget(@PathVariable("portalId") Long po
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update dashboard widget relation")
     @PutMapping(value = "/{portalId}/dashboards/widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateMemDashboardWidget(@PathVariable("portalId") Long portalId,
                                                    @Valid @RequestBody MemDashboardWidgetDto[] memDashboardWidgets,
-                                                   BindingResult bindingResult,
-                                                   @CurrentUser User user,
+                                                   @ApiIgnore BindingResult bindingResult,
+                                                   @ApiIgnore @CurrentUser User user,
                                                    HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
@@ -454,10 +452,10 @@ public ResponseEntity updateMemDashboardWidget(@PathVariable("portalId") Long po
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete dashboard widget relation")
     @DeleteMapping(value = "/dashboards/widgets/{relationId}")
     public ResponseEntity deleteMemDashboardWidget(@PathVariable Long relationId,
-                                                   @CurrentUser User user,
+                                                   @ApiIgnore @CurrentUser User user,
                                                    HttpServletRequest request) {
         dashboardService.deleteMemDashboardWidget(relationId, user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request));
@@ -473,11 +471,11 @@ public ResponseEntity deleteMemDashboardWidget(@PathVariable Long relationId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "share dashboard")
     @GetMapping("/dashboards/{dashboardId}/share")
     public ResponseEntity shareDashboard(@PathVariable Long dashboardId,
                                          @RequestParam(required = false) String username,
-                                         @CurrentUser User user,
+                                         @ApiIgnore @CurrentUser User user,
                                          HttpServletRequest request) {
 
         if (invalidId(dashboardId)) {
diff --git a/server/src/main/java/edp/davinci/controller/DashboardPreviewController.java b/server/src/main/java/edp/davinci/controller/DashboardPreviewController.java
index e658c28bb..3c9a575e5 100644
--- a/server/src/main/java/edp/davinci/controller/DashboardPreviewController.java
+++ b/server/src/main/java/edp/davinci/controller/DashboardPreviewController.java
@@ -1,35 +1,37 @@
 package edp.davinci.controller;
 
 import com.google.common.collect.Iterables;
-import com.google.common.collect.Lists;
 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.core.common.job.ScheduleService;
 import edp.davinci.core.common.Constants;
 import edp.davinci.dao.DashboardMapper;
 import edp.davinci.dto.dashboardDto.DashboardWithPortal;
-import edp.davinci.model.Dashboard;
 import edp.davinci.model.User;
 import edp.davinci.service.screenshot.ImageContent;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.apache.commons.io.IOUtils;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.http.MediaType;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;
 
-import javax.imageio.ImageIO;
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
-import java.awt.image.BufferedImage;
 import java.io.File;
 import java.io.FileInputStream;
 import java.io.IOException;
 import java.util.List;
 
+@Api(value = "/dashboard", tags = "dashboard", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "dashboard not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/dashboard", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/dashboard", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class DashboardPreviewController {
 
     @Autowired
@@ -42,93 +44,37 @@ public class DashboardPreviewController {
     @Value("${file.userfiles-path}")
     private String fileBasePath;
 
-    @MethodLog
+    @ApiOperation(value = "preview dashboard")
     @GetMapping(value = "/{id}/preview", produces = MediaType.IMAGE_PNG_VALUE)
     @ResponseBody
     public void previewDisplay(@PathVariable Long id,
                                @RequestParam(required = false) String username,
-                               @CurrentUser User user,
+                               @ApiIgnore @CurrentUser User user,
                                HttpServletRequest request,
                                HttpServletResponse response) throws IOException {
         DashboardWithPortal dashboardWithPortalAndProject = dashboardMapper.getDashboardWithPortalAndProject(id);
+        if(!user.getId().equals(dashboardWithPortalAndProject.getProject().getUserId())){
+            response.setContentType(MediaType.TEXT_PLAIN_VALUE);
+            response.getWriter().write("You have no access to this dashboard.");
+            return;
+        }
         FileInputStream inputStream = null;
         try {
             List<ImageContent> imageFiles = scheduleService.getPreviewImage(user.getId(), "dashboard", id);
             File imageFile = Iterables.getFirst(imageFiles, null).getImageFile();
-            if(null != imageFile) {
-                inputStream = new FileInputStream(imageFile);
-                response.setContentType(MediaType.IMAGE_PNG_VALUE);
-                IOUtils.copy(inputStream, response.getOutputStream());
-            } else {
-                log.error("Execute display failed, because image file is null.");
-                response.sendError(504, "Execute display failed, because image file is null.");
-            }
+            inputStream = new FileInputStream(imageFile);
+            response.setContentType(MediaType.IMAGE_PNG_VALUE);
+            IOUtils.copy(inputStream, response.getOutputStream());
+        } catch (IOException e) {
+            e.printStackTrace();
+            log.error(e.getMessage());
         } catch (Exception e) {
-            log.error("dashboard preview error: " + e);
+            e.printStackTrace();
+            log.error(e.getMessage());
         } finally {
-            if(null != inputStream) {
-                inputStream.close();
-            }
-        }
-    }
-
-    @MethodLog
-    @GetMapping(value = "/portal/{id}/preview", produces = MediaType.IMAGE_PNG_VALUE)
-    @ResponseBody
-    public void previewPortal(@PathVariable Long id,
-                              @RequestParam(required = false) String username,
-                              @CurrentUser User user,
-                              HttpServletRequest request,
-                              HttpServletResponse response) throws Exception {
-
-        List<Dashboard> dashboards = dashboardMapper.getByPortalId(id);
-        List<File> finalFiles = Lists.newArrayList();
-        for(Dashboard dashboard : dashboards){
-            List<ImageContent> imageFiles = scheduleService.getPreviewImage(user.getId(), "dashboard", dashboard.getId());
-            File imageFile = Iterables.getFirst(imageFiles, null).getImageFile();
-            finalFiles.add(imageFile);
-            if(null == imageFile) {
-                log.error("{} reports an error when executing the dashboard: {}, and the picture is null", username, dashboard.getId());
-                response.sendError(504, "Execute dashboard failed, because image file is null.");
-                return;
-            }
-        }
-        BufferedImage merged = mergeImage(finalFiles.toArray(new File[0]));
-        response.setContentType(MediaType.IMAGE_PNG_VALUE);
-        ImageIO.write(merged, "png", response.getOutputStream());
-    }
-
-    public static BufferedImage mergeImage(File[] src) throws IOException {
-        int len = src.length;
-        if(len == 1){
-            return ImageIO.read(src[0]);
-        }
-        BufferedImage[] images = new BufferedImage[len];
-        int[][] ImageArrays = new int[len][];
-        for (int i = 0; i < len; i++) {
-            images[i] = ImageIO.read(src[i]);
-            int width = images[i].getWidth();
-            int height = images[i].getHeight();
-            ImageArrays[i] = new int[width * height];
-            ImageArrays[i] = images[i].getRGB(0, 0, width, height, ImageArrays[i], 0, width);
-        }
-        int newHeight = 0;
-        int newWidth = 0;
-        for (int i = 0; i < images.length; i++) {
-            newWidth = newWidth > images[i].getWidth() ? newWidth : images[i].getWidth();
-            newHeight += images[i].getHeight();
+            inputStream.close();
         }
-
-
-        BufferedImage ImageNew = new BufferedImage(newWidth, newHeight, BufferedImage.TYPE_INT_RGB);
-        int height_i = 0;
-        for (int i = 0; i < images.length; i++) {
-            ImageNew.setRGB(0, height_i, newWidth, images[i].getHeight(), ImageArrays[i], 0, newWidth);
-            height_i += images[i].getHeight();
-        }
-        return ImageNew;
     }
-
 }
diff --git a/server/src/main/java/edp/davinci/controller/DisplayController.java b/server/src/main/java/edp/davinci/controller/DisplayController.java
index 4a9893c75..2450a41a7 100644
--- a/server/src/main/java/edp/davinci/controller/DisplayController.java
+++ b/server/src/main/java/edp/davinci/controller/DisplayController.java
@@ -20,11 +20,8 @@
 
 package edp.davinci.controller;
 
 import com.alibaba.druid.util.StringUtils;
-import com.alibaba.fastjson.JSONArray;
 import com.google.common.collect.Iterables;
-import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth;
 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.core.common.job.ScheduleService;
 import edp.davinci.common.controller.BaseController;
 import edp.davinci.core.common.Constants;
@@ -35,17 +32,22 @@
 import edp.davinci.model.*;
 import edp.davinci.service.DisplayService;
 import edp.davinci.service.screenshot.ImageContent;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.apache.commons.io.IOUtils;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Value;
-import org.springframework.http.HttpStatus;
 import org.springframework.http.MediaType;
 import org.springframework.http.ResponseEntity;
 import org.springframework.validation.BindingResult;
 import org.springframework.web.bind.annotation.*;
 import org.springframework.web.multipart.MultipartFile;
+import springfox.documentation.annotations.ApiIgnore;
 
+import javax.imageio.ImageIO;
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
 import javax.validation.Valid;
@@ -53,11 +55,12 @@
 import java.io.FileInputStream;
 import java.io.IOException;
 import java.util.List;
-import java.util.Map;
 
+@Api(value = "/displays", tags = "displays", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "display not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/displays", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/displays", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class DisplayController extends BaseController {
 
     @Autowired
@@ -69,16 +72,12 @@ public class DisplayController extends BaseController {
     //TODO not this layer, should be removed
     @Autowired
     DisplayMapper displayMapper;
-
     @Autowired
     ProjectMapper projectMapper;
 
     @Value("${file.userfiles-path}")
     private String fileBasePath;
 
-    @Autowired
-    private ProjectAuth projectAuth;
-
     /**
      * Create a new display
      *
@@ -88,22 +87,17 @@ public class DisplayController extends BaseController {
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create new display", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createDisplay(@Valid @RequestBody DisplayInfo displayInfo,
-                                        BindingResult bindingResult,
-                                        @CurrentUser User user,
+                                        @ApiIgnore BindingResult bindingResult,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
         }
-
-        if(!projectAuth.isPorjectOwner(displayInfo.getProjectId(), user.getId())) {
-            return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build();
-        }
-
         Display display;
         if(displayInfo.getIsCopy()){
             display = displayService.copyDisplay(displayInfo, user);
@@ -123,11 +117,11 @@ public ResponseEntity createDisplay(@Valid @RequestBody DisplayInfo displayInfo,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update display info", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateDisplay(@Valid @RequestBody DisplayUpdate display,
-                                        BindingResult bindingResult,
-                                        @CurrentUser User user,
+                                        @ApiIgnore BindingResult bindingResult,
+                                        @ApiIgnore @CurrentUser User user,
                                         @PathVariable Long id,
                                         HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
@@ -152,10 +146,10 @@ public ResponseEntity updateDisplay(@Valid @RequestBody DisplayUpdate display,
      * @param request
      * @return
      */
-    @MethodLog
-    @PostMapping("/{id}")
+    @ApiOperation(value = "delete a display", consumes = MediaType.APPLICATION_JSON_VALUE)
+    @DeleteMapping("/{id}")
     public ResponseEntity deleteDisplay(@PathVariable Long id,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
 
         if (invalidId(id)) {
@@ -179,12 +173,12 @@ public ResponseEntity deleteDisplay(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create new display slide", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(value = "/{id}/slides", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createDisplaySlide(@Valid @RequestBody DisplaySlideCreate displaySlideCreate,
-                                             BindingResult bindingResult,
+                                             @ApiIgnore BindingResult bindingResult,
                                              @PathVariable("id") Long displayId,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -211,11 +205,11 @@ public ResponseEntity createDisplaySlide(@Valid @RequestBody DisplaySlideCreate
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update display slides info", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{id}/slides", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateDisplaySlide(@Valid @RequestBody DisplaySlide[] displaySlides,
-                                             BindingResult bindingResult,
-                                             @CurrentUser User user,
+                                             @ApiIgnore BindingResult bindingResult,
+                                             @ApiIgnore @CurrentUser User user,
                                              @PathVariable("id") Long displayId,
                                              HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
@@ -246,10 +240,10 @@ public ResponseEntity updateDisplaySlide(@Valid @RequestBody DisplaySlide[] disp
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete display slide", consumes = MediaType.APPLICATION_JSON_VALUE)
     @DeleteMapping("/slides/{slideId}")
     public ResponseEntity deleteDisplaySlide(@PathVariable("slideId") Long slideId,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
 
         if (invalidId(slideId)) {
@@ -273,13 +267,13 @@ public ResponseEntity deleteDisplaySlide(@PathVariable("slideId") Long slideId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "add display slide widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(value = "/{displayId}/slides/{slideId}/widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity addMemDisplaySlideWidgets(@PathVariable("displayId") Long displayId,
                                                     @PathVariable("slideId") Long slideId,
                                                     @Valid @RequestBody MemDisplaySlideWidgetCreate[] slideWidgetCreates,
-                                                    BindingResult bindingResult,
-                                                    @CurrentUser User user,
+                                                    @ApiIgnore BindingResult bindingResult,
+                                                    @ApiIgnore @CurrentUser User user,
                                                     HttpServletRequest request) {
 
         if (invalidId(displayId)) {
@@ -328,13 +322,13 @@ public ResponseEntity addMemDisplaySlideWidgets(@PathVariable("displayId") Long
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update display slide widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{displayId}/slides/{slideId}/widgets", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateMemDisplaySlideWidgets(@PathVariable("displayId") Long displayId,
                                                        @PathVariable("slideId") Long slideId,
                                                        @Valid @RequestBody MemDisplaySlideWidgetDto[] memDisplaySlideWidgets,
-                                                       BindingResult bindingResult,
-                                                       @CurrentUser User user,
+                                                       @ApiIgnore BindingResult bindingResult,
+                                                       @ApiIgnore @CurrentUser User user,
                                                        HttpServletRequest request) {
 
         if (invalidId(displayId)) {
@@ -383,12 +377,12 @@ public ResponseEntity updateMemDisplaySlideWidgets(@PathVariable("displayId") Lo
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update display slide widget", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/slides/widgets/{relationId}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateMemDisplaySlideWidget(@PathVariable("relationId") Long relationId,
                                                       @Valid @RequestBody MemDisplaySlideWidget memDisplaySlideWidget,
-                                                      BindingResult bindingResult,
-                                                      @CurrentUser User user,
+                                                      @ApiIgnore BindingResult bindingResult,
+                                                      @ApiIgnore @CurrentUser User user,
                                                       HttpServletRequest request) {
 
         if (bindingResult.hasErrors()) {
@@ -414,10 +408,10 @@ public ResponseEntity updateMemDisplaySlideWidget(@PathVariable("relationId") Lo
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete display slide widget", consumes = MediaType.APPLICATION_JSON_VALUE)
     @DeleteMapping("/slides/widgets/{relationId}")
     public ResponseEntity deleteMemDisplaySlideWidget(@PathVariable("relationId") Long relationId,
-                                                      @CurrentUser User user,
+                                                      @ApiIgnore @CurrentUser User user,
                                                       HttpServletRequest request) {
 
         if (invalidId(relationId)) {
@@ -437,10 +431,10 @@ public ResponseEntity deleteMemDisplaySlideWidget(@PathVariable("relationId") Lo
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get displays")
     @GetMapping
     public ResponseEntity getDisplays(@RequestParam Long projectId,
-                                      @CurrentUser User user,
+                                      @ApiIgnore @CurrentUser User user,
                                       HttpServletRequest request) {
 
         if (invalidId(projectId)) {
@@ -460,10 +454,10 @@ public ResponseEntity getDisplays(@RequestParam Long projectId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get display slides")
     @GetMapping("/{id}/slides")
     public ResponseEntity getDisplaySlide(@PathVariable Long id,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid Display id");
@@ -483,11 +477,11 @@ public ResponseEntity getDisplaySlide(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get display slide widgets")
     @GetMapping("/{displayId}/slides/{slideId}")
     public ResponseEntity getDisplaySlideWidgets(@PathVariable("displayId") Long displayId,
                                                  @PathVariable("slideId") Long slideId,
-                                                 @CurrentUser User user,
+                                                 @ApiIgnore @CurrentUser User user,
                                                  HttpServletRequest request) {
 
         if (invalidId(displayId)) {
@@ -514,17 +508,14 @@ public ResponseEntity getDisplaySlideWidgets(@PathVariable("displayId") Long dis
      * @param request
      * @return
      */
-    // RequestBody: {"slides":["83"],"labels":{"route":"dev"}}
-    @MethodLog
+    @ApiOperation(value = "delete display slide widgets")
     @DeleteMapping("/{displayId}/slides/{slideId}/widgets")
     public ResponseEntity deleteDisplaySlideWeight(@PathVariable("displayId") Long displayId,
                                                    @PathVariable("slideId") Long slideId,
-                                                   @RequestBody Map<String, Object> param,
-                                                   @CurrentUser User user,
+                                                   @RequestBody Long[] ids,
+                                                   @ApiIgnore @CurrentUser User user,
                                                    HttpServletRequest request) {
-        Long[] ids = ((JSONArray) param.get("slides")).toJavaList(Long.class).toArray(new Long[]{});
-
         if (invalidId(displayId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid Display id");
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
@@ -552,7 +543,7 @@ public ResponseEntity deleteDisplaySlideWeight(@PathVariable("displayId") Long d
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "upload avatar")
     @PostMapping(value = "/upload/coverImage")
     public ResponseEntity uploadAvatar(@RequestParam("coverImage") MultipartFile file,
                                        HttpServletRequest request) {
@@ -577,11 +568,11 @@ public ResponseEntity uploadAvatar(@RequestParam("coverImage") MultipartFile fil
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "upload avatar")
    @PostMapping(value = "/slide/{slideId}/upload/bgImage")
     public ResponseEntity uploadSlideBGImage(@PathVariable Long slideId,
                                              @RequestParam("backgroundImage") MultipartFile file,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
 
         if (invalidId(slideId)) {
@@ -607,11 +598,11 @@ public ResponseEntity uploadSlideBGImage(@PathVariable Long slideId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "upload subwidget bgImage")
     @PostMapping(value = "/slide/widget/{relationId}/bgImage")
     public ResponseEntity uploadSlideSubWidgetBGImage(@PathVariable Long relationId,
                                                       @RequestParam("backgroundImage") MultipartFile file,
-                                                      @CurrentUser User user,
+                                                      @ApiIgnore @CurrentUser User user,
                                                       HttpServletRequest request) {
 
         if (invalidId(relationId)) {
@@ -637,11 +628,11 @@ public ResponseEntity uploadSlideSubWidgetBGImage(@PathVariable Long relationId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "share display")
     @GetMapping("/{id}/share")
     public ResponseEntity shareDisplay(@PathVariable Long id,
                                        @RequestParam(required = false) String username,
-                                       @CurrentUser User user,
+                                       @ApiIgnore @CurrentUser User user,
                                        HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id");
@@ -660,7 +651,7 @@ public ResponseEntity shareDisplay(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get display exclude roles")
     @GetMapping("/{id}/exclude/roles")
     public ResponseEntity getDisplayExcludeRoles(@PathVariable Long id,
                                                  HttpServletRequest request) {
@@ -681,7 +672,7 @@ public ResponseEntity getDisplayExcludeRoles(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get display slide exclude roles")
     @GetMapping("/slide/{id}/exclude/roles")
     public ResponseEntity getSlideExcludeRoles(@PathVariable Long id,
                                                HttpServletRequest request) {
@@ -694,35 +685,37 @@ public ResponseEntity getSlideExcludeRoles(@PathVariable Long id,
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(excludeRoles));
     }
 
-    @MethodLog
+    @ApiOperation(value = "preview display")
     @GetMapping(value = "/{id}/preview", produces = MediaType.IMAGE_PNG_VALUE)
     @ResponseBody
     public void previewDisplay(@PathVariable Long id,
                                @RequestParam(required = false) String username,
-                               @CurrentUser User user,
+                               @ApiIgnore @CurrentUser User user,
                                HttpServletRequest request,
                                HttpServletResponse response) throws IOException {
         Display display = displayMapper.getById(id);
         Project project = projectMapper.getById(display.getProjectId());
+        if(!user.getId().equals(project.getUserId())){
+            response.setContentType(MediaType.TEXT_PLAIN_VALUE);
+            response.getWriter().write("You have no access to this display.");
+            return;
+        }
         FileInputStream inputStream = null;
         try {
             List<ImageContent> imageFiles = scheduleService.getPreviewImage(user.getId(), "display", id);
             File imageFile = Iterables.getFirst(imageFiles, null).getImageFile();
-            if(null != imageFile) {
-                inputStream = new FileInputStream(imageFile);
-                response.setContentType(MediaType.IMAGE_PNG_VALUE);
-                IOUtils.copy(inputStream, response.getOutputStream());
-            } else {
-                log.error("Execute display failed, because image file is null.");
-                response.sendError(504, "Execute display failed, because image file is null.");
-            }
+            inputStream = new FileInputStream(imageFile);
+            response.setContentType(MediaType.IMAGE_PNG_VALUE);
+            IOUtils.copy(inputStream, response.getOutputStream());
+        } catch (IOException e) {
+            e.printStackTrace();
+            log.error(e.getMessage());
         } catch (Exception e) {
-            log.error("display preview error: ", e);
+            e.printStackTrace();
+            log.error(e.getMessage());
         } finally {
-            if(null != inputStream) {
-                inputStream.close();
-            }
+            inputStream.close();
         }
     }
 }
diff --git a/server/src/main/java/edp/davinci/controller/DownloadController.java b/server/src/main/java/edp/davinci/controller/DownloadController.java
index 739083a3b..c3be2584a 100644
--- a/server/src/main/java/edp/davinci/controller/DownloadController.java
+++ b/server/src/main/java/edp/davinci/controller/DownloadController.java
@@ -22,7 +22,6 @@
 import com.alibaba.druid.util.StringUtils;
 import edp.core.annotation.AuthIgnore;
 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.davinci.common.controller.BaseController;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.common.ResultMap;
@@ -34,12 +33,17 @@
 import edp.davinci.model.User;
 import edp.davinci.service.DownloadService;
 import edp.davinci.service.ShareDownloadService;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.elasticsearch.common.io.Streams;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.http.MediaType;
 import org.springframework.http.ResponseEntity;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;
 
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
@@ -58,6 +62,8 @@
  * @Date 19/5/27 20:30
  * To change this template use File | Settings | File Templates.
 */
+@Api(value = "/download", tags = "download", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "download not found"))
 @Slf4j
 @RestController
 @RequestMapping(value = Constants.BASE_API_PATH + "/download")
@@ -69,15 +75,16 @@ public class DownloadController extends BaseController {
     @Autowired
     private ShareDownloadService shareDownloadService;

-    @GetMapping(value = "/page", produces = MediaType.APPLICATION_JSON_VALUE)
-    public ResponseEntity getDownloadRecordPage(@CurrentUser User user,
+    @ApiOperation(value = "get download record page")
+    @GetMapping(value = "/page", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+    public ResponseEntity getDownloadRecordPage(@ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
         List records = downloadService.queryDownloadRecordPage(user.getId());
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(records));
     }

-    @MethodLog
+    @ApiOperation(value = "get download record file")
     @GetMapping(value = "/record/file/{id}/{token:.*}", produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
     @AuthIgnore
     public ResponseEntity getDownloadRecordFile(@PathVariable Long id,
@@ -95,11 +102,11 @@ public ResponseEntity getDownloadRecordFile(@PathVariable Long id,
     }

-    @MethodLog
-    @PostMapping(value = "/submit/{type}/{id}", produces = MediaType.APPLICATION_JSON_VALUE)
+    @ApiOperation(value = "submit download task")
+    @PostMapping(value = "/submit/{type}/{id}", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
     public ResponseEntity submitDownloadTask(@PathVariable String type,
                                              @PathVariable Long id,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              @Valid @RequestBody(required = false) DownloadViewExecuteParam[] params,
                                              HttpServletRequest request) {
         List downloadViewExecuteParams = Arrays.asList(params);
@@ -109,14 +116,14 @@ public ResponseEntity submitDownloadTask(@PathVariable String type,
     }

-    @MethodLog
-    @PostMapping(value = "/share/submit/{type}/{uuid}/{dataToken:.*}", produces = MediaType.APPLICATION_JSON_VALUE)
+    @ApiOperation(value = "submit share download")
+    @PostMapping(value = "/share/submit/{type}/{uuid}/{dataToken:.*}", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
     @AuthIgnore
     public ResponseEntity submitShareDownloadTask(@PathVariable(name = "type") String type,
                                                   @PathVariable(name = "uuid") String uuid,
                                                   @PathVariable(name = "dataToken") String dataToken,
                                                   @Valid @RequestBody(required = false) DownloadViewExecuteParam[] params,
-                                                  @CurrentUser User user,
+                                                  @ApiIgnore @CurrentUser User user,
                                                   HttpServletRequest request) {
@@ -132,12 +139,12 @@ public ResponseEntity submitShareDownloadTask(@PathVariable(name = "type") Strin
     }

-    @MethodLog
-    @GetMapping(value = "/share/page/{uuid}/{token:.*}", produces = MediaType.APPLICATION_JSON_VALUE)
+    @ApiOperation(value = "get share download record page")
+    @GetMapping(value = "/share/page/{uuid}/{token:.*}", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
     @AuthIgnore
     public ResponseEntity getShareDownloadRecordPage(@PathVariable(name = "uuid") String uuid,
                                                      @PathVariable(name = "token") String token,
-                                                     @CurrentUser User user,
+                                                     @ApiIgnore @CurrentUser User user,
                                                      HttpServletRequest request) {
         if (StringUtils.isEmpty(token)) {
             ResultMap resultMap = new ResultMap().fail().message("Invalid share token");
@@ -154,13 +161,13 @@ public ResponseEntity getShareDownloadRecordPage(@PathVariable(name = "uuid") St
     }

-    @MethodLog
+    @ApiOperation(value = "get share download record file")
     @GetMapping(value = "/share/record/file/{id}/{uuid}/{token:.*}", produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
     @AuthIgnore
     public ResponseEntity getShareDownloadRecordFile(@PathVariable(name = "id") String id,
                                                      @PathVariable(name = "uuid") String uuid,
                                                      @PathVariable(name = "token") String token,
-                                                     @CurrentUser User user,
+                                                     @ApiIgnore @CurrentUser User user,
                                                      HttpServletRequest request,
                                                      HttpServletResponse response) {
         if (StringUtils.isEmpty(token)) {
diff --git a/server/src/main/java/edp/davinci/controller/HomeController.java b/server/src/main/java/edp/davinci/controller/HomeController.java
index 8ebd94124..a2e156dc6 100644
--- a/server/src/main/java/edp/davinci/controller/HomeController.java
+++ b/server/src/main/java/edp/davinci/controller/HomeController.java
@@ -19,26 +19,24 @@
 package edp.davinci.controller;

-import edp.core.annotation.MethodLog;
 import org.springframework.stereotype.Controller;
 import org.springframework.web.bind.annotation.RequestMapping;
+import springfox.documentation.annotations.ApiIgnore;

+@ApiIgnore
 @Controller
 public class HomeController {

-    @MethodLog
     @RequestMapping("swagger")
     public String swagger() {
         return "redirect:swagger-ui.html";
     }

-    @MethodLog
     @RequestMapping(value = {"", "/"})
     public String index() {
         return "index";
     }

-    @MethodLog
     @RequestMapping("share/")
     public String shareIndex() {
         return "share";
diff --git a/server/src/main/java/edp/davinci/controller/ImageController.java b/server/src/main/java/edp/davinci/controller/ImageController.java
index a01574aff..abbcfa925 100644
--- a/server/src/main/java/edp/davinci/controller/ImageController.java
+++ b/server/src/main/java/edp/davinci/controller/ImageController.java
@@ -1,14 +1,18 @@
 package edp.davinci.controller;

 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.davinci.core.common.Constants;
 import edp.davinci.model.User;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.apache.commons.io.IOUtils;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.http.MediaType;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;

 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
@@ -16,22 +20,24 @@
 import java.io.FileInputStream;
 import java.io.IOException;

+@Api(value = "/image", tags = "image", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "image not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/image", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/image", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class ImageController {

     @Value("${file.userfiles-path}")
     private String fileBasePath;

-    @MethodLog
+    @ApiOperation(value = "get display bg image")
     @GetMapping(value = "/display/{fileName}", produces = MediaType.IMAGE_PNG_VALUE)
     @ResponseBody
     public void getImage(@PathVariable String fileName,
-                         @RequestParam(required = false) String username,
-                         @CurrentUser User user,
-                         HttpServletRequest request,
-                         HttpServletResponse response) throws IOException {
+                         @RequestParam(required = false) String username,
+                         @ApiIgnore @CurrentUser User user,
+                         HttpServletRequest request,
+                         HttpServletResponse response) throws IOException {
         FileInputStream inputStream = null;
         try {
@@ -41,9 +47,11 @@ public void getImage(@PathVariable String fileName,
             response.setContentType(MediaType.IMAGE_PNG_VALUE);
             IOUtils.copy(inputStream, response.getOutputStream());
         } catch (IOException e) {
-            log.error("get image io error: " + e);
+            e.printStackTrace();
+            log.error(e.getMessage());
         } catch (Exception e) {
-            log.error("get image error: " + e);
+            e.printStackTrace();
+            log.error(e.getMessage());
         } finally {
             inputStream.close();
         }
diff --git a/server/src/main/java/edp/davinci/controller/LoginController.java b/server/src/main/java/edp/davinci/controller/LoginController.java
index 9029f67e4..41a3a6969 100644
--- a/server/src/main/java/edp/davinci/controller/LoginController.java
+++ b/server/src/main/java/edp/davinci/controller/LoginController.java
@@ -19,7 +19,6 @@
 package edp.davinci.controller;

-import edp.core.annotation.MethodLog;
 import edp.core.utils.TokenUtils;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.common.ResultMap;
@@ -27,6 +26,10 @@
 import edp.davinci.dto.userDto.UserLoginResult;
 import edp.davinci.model.User;
 import edp.davinci.service.UserService;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.core.env.Environment;
@@ -37,13 +40,19 @@
 import org.springframework.web.bind.annotation.RequestBody;
 import org.springframework.web.bind.annotation.RequestMapping;
 import org.springframework.web.bind.annotation.RestController;
+import springfox.documentation.annotations.ApiIgnore;

 import javax.validation.Valid;

+@Api(tags = "login", basePath = Constants.BASE_API_PATH, consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses({
+        @ApiResponse(code = 400, message = "pwd is wrong"),
+        @ApiResponse(code = 404, message = "user not found")
+})
 @RestController
 @Slf4j
-@RequestMapping(value = Constants.BASE_API_PATH + "/login", consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/login", consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class LoginController {

     @Autowired
@@ -62,9 +71,9 @@ public class LoginController {
      * @param bindingResult
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "Log in to the server and return token")
     @PostMapping
-    public ResponseEntity login(@Valid @RequestBody UserLogin userLogin, BindingResult bindingResult) {
+    public ResponseEntity login(@Valid @RequestBody UserLogin userLogin, @ApiIgnore BindingResult bindingResult) {
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap().fail().message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
diff --git a/server/src/main/java/edp/davinci/controller/OrganizationController.java b/server/src/main/java/edp/davinci/controller/OrganizationController.java
index 51bd5b8dc..51a714eb0 100644
--- a/server/src/main/java/edp/davinci/controller/OrganizationController.java
+++ b/server/src/main/java/edp/davinci/controller/OrganizationController.java
@@ -22,7 +22,6 @@
 import com.alibaba.druid.util.StringUtils;
 import com.github.pagehelper.PageInfo;
 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.davinci.common.controller.BaseController;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.common.ResultMap;
@@ -33,6 +32,10 @@
 import edp.davinci.service.OrganizationService;
 import edp.davinci.service.ProjectService;
 import edp.davinci.service.RoleService;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.http.MediaType;
@@ -40,15 +43,18 @@
 import org.springframework.validation.BindingResult;
 import org.springframework.web.bind.annotation.*;
 import org.springframework.web.multipart.MultipartFile;
+import springfox.documentation.annotations.ApiIgnore;

 import javax.servlet.http.HttpServletRequest;
 import javax.validation.Valid;
 import java.util.List;
 import java.util.Map;

+@Api(value = "/organization", tags = "organization", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "organization not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/organizations", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/organizations", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class OrganizationController extends BaseController {

     @Autowired
@@ -68,11 +74,11 @@ public class OrganizationController extends BaseController {
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create organization")
     @PostMapping
     public ResponseEntity createOrganization(@Valid @RequestBody OrganizationCreate organizationCreate,
-                                             BindingResult bindingResult,
-                                             @CurrentUser User user,
+                                             @ApiIgnore BindingResult bindingResult,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
@@ -92,12 +98,12 @@ public ResponseEntity createOrganization(@Valid @RequestBody OrganizationCreate
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update organization")
     @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateOrganization(@PathVariable Long id,
                                              @Valid @RequestBody OrganizationPut organizationPut,
-                                             BindingResult bindingResult,
-                                             @CurrentUser User user,
+                                             @ApiIgnore BindingResult bindingResult,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
         if (invalidId(id) || !id.equals(organizationPut.getId())) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -122,11 +128,11 @@ public ResponseEntity updateOrganization(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "upload organization avatar")
     @PostMapping(value = "/{id}/avatar")
     public ResponseEntity uploadOrgAvatar(@PathVariable Long id,
                                           @RequestParam("file") MultipartFile file,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -151,10 +157,10 @@ public ResponseEntity uploadOrgAvatar(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete organization")
     @DeleteMapping("/{id}")
     public ResponseEntity deleteOrganization(@PathVariable Long id,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -173,10 +179,10 @@ public ResponseEntity deleteOrganization(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get organization")
     @GetMapping("/{id}")
     public ResponseEntity getOrganization(@PathVariable Long id,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -195,9 +201,9 @@ public ResponseEntity getOrganization(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get organizations")
     @GetMapping
-    public ResponseEntity getOrganizations(@CurrentUser User user, HttpServletRequest request) {
+    public ResponseEntity getOrganizations(@ApiIgnore @CurrentUser User user, HttpServletRequest request) {
         List organizations = organizationService.getOrganizations(user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(organizations));
     }
@@ -211,13 +217,13 @@ public ResponseEntity getOrganizations(@CurrentUser User user, HttpServletReques
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get organization projects")
     @GetMapping("/{id}/projects")
     public ResponseEntity getOrgProjects(@PathVariable Long id,
                                          @RequestParam(value = "keyword", required = false, defaultValue = "") String keyword,
                                         @RequestParam(value = "pageNum", required = false, defaultValue = "1") int pageNum,
                                         @RequestParam(value = "pageSize", required = false, defaultValue = "10") int pageSize,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -236,7 +242,7 @@ public ResponseEntity getOrgProjects(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get organization members")
     @GetMapping("/{id}/members")
     public ResponseEntity getOrgMembers(@PathVariable Long id, HttpServletRequest request) {
         if (invalidId(id)) {
@@ -256,10 +262,10 @@ public ResponseEntity getOrgMembers(@PathVariable Long id, HttpServletRequest re
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get organization roles")
     @GetMapping("/{id}/roles")
     public ResponseEntity getOrgRoles(@PathVariable Long id,
-                                      @CurrentUser User user,
+                                      @ApiIgnore @CurrentUser User user,
                                       HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid organization id");
@@ -279,11 +285,11 @@ public ResponseEntity getOrgRoles(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "invite member to join the organization")
     @PostMapping("/{orgId}/member/{memId}")
     public ResponseEntity inviteMember(@PathVariable("orgId") Long orgId,
                                        @PathVariable("memId") Long memId,
-                                       @CurrentUser User user,
+                                       @ApiIgnore @CurrentUser User user,
                                        HttpServletRequest request) {

         if (invalidId(orgId)) {
@@ -333,10 +339,10 @@ public ResponseEntity inviteMember(@PathVariable("orgId") Long orgId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "member confirm invite")
     @PostMapping("/confirminvite/{token}")
     public ResponseEntity confirmInvite(@PathVariable("token") String token,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (StringUtils.isEmpty(token)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("The invitation confirm token can not be EMPTY");
@@ -352,10 +358,10 @@ public ResponseEntity confirmInvite(@PathVariable("token") String token,
      * @param relationId
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete member from organization")
     @DeleteMapping("/member/{relationId}")
     public ResponseEntity deleteOrgMember(@PathVariable Long relationId,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(relationId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid relation id");
@@ -375,12 +381,12 @@ public ResponseEntity deleteOrgMember(@PathVariable Long relationId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "change member role of organization", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/member/{relationId}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateMemberRole(@PathVariable Long relationId,
                                            @Valid @RequestBody OrganzationRole organzationRole,
-                                           BindingResult bindingResult,
-                                           @CurrentUser User user,
+                                           @ApiIgnore BindingResult bindingResult,
+                                           @ApiIgnore @CurrentUser User user,
                                            HttpServletRequest request) {

         if (invalidId(relationId)) {
diff --git a/server/src/main/java/edp/davinci/controller/ProjectController.java b/server/src/main/java/edp/davinci/controller/ProjectController.java
index f9bd04a3b..6fd8868df 100644
--- a/server/src/main/java/edp/davinci/controller/ProjectController.java
+++ b/server/src/main/java/edp/davinci/controller/ProjectController.java
@@ -21,7 +21,6 @@

 import com.github.pagehelper.PageInfo;
 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.davinci.common.controller.BaseController;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.common.ResultMap;
@@ -33,21 +32,28 @@
 import edp.davinci.model.User;
 import edp.davinci.service.ProjectService;
 import edp.davinci.service.RoleService;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.http.MediaType;
 import org.springframework.http.ResponseEntity;
 import org.springframework.validation.BindingResult;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;

 import javax.servlet.http.HttpServletRequest;
 import javax.validation.Valid;
 import java.util.Arrays;
 import java.util.List;

+@Api(value = "/project", tags = "project", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "project not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/projects", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/projects", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class ProjectController extends BaseController {

     @Autowired
@@ -64,16 +70,17 @@ public class ProjectController extends BaseController {
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get projects")
     @GetMapping
-    public ResponseEntity getProjects(@CurrentUser User user, HttpServletRequest request) {
+    public ResponseEntity getProjects(@ApiIgnore @CurrentUser User user, HttpServletRequest request) {
         List projects = projectService.getProjects(user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(projects));
     }

-    @MethodLog
+
+    @ApiOperation(value = "get roles where project is located")
     @GetMapping("/{id}/roles")
-    public ResponseEntity getRolesOfProject(@CurrentUser User user,
+    public ResponseEntity getRolesOfProject(@ApiIgnore @CurrentUser User user,
                                             @PathVariable Long id,
                                             HttpServletRequest request) {
         if (invalidId(id)) {
@@ -85,9 +92,10 @@ public ResponseEntity getRolesOfProject(@CurrentUser User user,
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(list));
     }

-    @MethodLog
+
+    @ApiOperation(value = "get role of project")
     @GetMapping("/{id}/roles/{roleId}")
-    public ResponseEntity getRoleOfProject(@CurrentUser User user,
+    public ResponseEntity getRoleOfProject(@ApiIgnore @CurrentUser User user,
                                            @PathVariable Long id,
                                            @PathVariable Long roleId,
                                            HttpServletRequest request) {
@@ -109,11 +117,11 @@ public ResponseEntity getRoleOfProject(@CurrentUser User user,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get project info")
     @GetMapping("/{id}")
     public ResponseEntity getProjectInfo(@PathVariable Long id,
-                                         @CurrentUser User user,
-                                         HttpServletRequest request) {
+                                         @ApiIgnore @CurrentUser User user,
+                                         @ApiIgnore HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id");
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
@@ -129,21 +137,22 @@ public ResponseEntity getProjectInfo(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get admins of project")
     @GetMapping("/{id}/admins")
     public ResponseEntity getAdmins(@PathVariable Long id,
-                                    @CurrentUser User user,
+                                    @ApiIgnore @CurrentUser User user,
                                     HttpServletRequest request) {
         List admins = projectService.getAdmins(id, user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(admins));
     }

-    @MethodLog
+
+    @ApiOperation(value = "search projects by keywords")
     @GetMapping("/search")
     public ResponseEntity searchProjects(@RequestParam(value = "keywords", required = false) String keywords,
                                          @RequestParam(value = "pageNum", required = false, defaultValue = "1") int pageNum,
                                          @RequestParam(value = "pageSize", required = false, defaultValue = "10") int pageSize,
-                                         @CurrentUser User user,
+                                         @ApiIgnore @CurrentUser User user,
                                          HttpServletRequest request) {
         PageInfo pageInfo = projectService.searchProjects(keywords, user, pageNum, pageSize);
@@ -159,11 +168,11 @@ public ResponseEntity searchProjects(@RequestParam(value = "keywords", required
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create project", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createProject(@Valid @RequestBody ProjectCreat projectCreat,
-                                        BindingResult bindingResult,
-                                        @CurrentUser User user,
+                                        @ApiIgnore BindingResult bindingResult,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
@@ -183,12 +192,12 @@ public ResponseEntity createProject(@Valid @RequestBody ProjectCreat projectCrea
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "transfer projects", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{id}/transfer", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity transferProject(@PathVariable Long id,
                                           @Valid @RequestBody OrganizationTransfer organizationTransfer,
-                                          BindingResult bindingResult,
-                                          @CurrentUser User user,
+                                          @ApiIgnore BindingResult bindingResult,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
@@ -214,17 +223,17 @@ public ResponseEntity transferProject(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
-    @PostMapping("/{id}")
+    @ApiOperation(value = "delete project")
+    @DeleteMapping("/{id}")
     public ResponseEntity deleteProject(@PathVariable Long id,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id");
             return ResponseEntity.status(resultMap.getCode()).body(resultMap);
         }
-        // Archive the project instead of actually deleting it
-        projectService.setProjectToArchive(id, user);
+
+        projectService.deleteProject(id, user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request));
     }
@@ -238,12 +247,12 @@ public ResponseEntity deleteProject(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update project", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateProjectBaseInfo(@PathVariable Long id,
                                                 @Valid @RequestBody ProjectUpdate projectUpdate,
-                                                BindingResult bindingResult,
-                                                @CurrentUser User user,
+                                                @ApiIgnore BindingResult bindingResult,
+                                                @ApiIgnore @CurrentUser User user,
                                                 HttpServletRequest request) {
         if (invalidId(id)) {
@@ -270,10 +279,10 @@ public ResponseEntity updateProjectBaseInfo(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "favorite project", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(value = "/favorite/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity favoriteProject(@PathVariable Long id,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id");
@@ -292,9 +301,9 @@ public ResponseEntity favoriteProject(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get favorite projects")
     @GetMapping(value = "/favorites")
-    public ResponseEntity getFavoriteProjects(@CurrentUser User user,
+    public ResponseEntity getFavoriteProjects(@ApiIgnore @CurrentUser User user,
                                               HttpServletRequest request) {
         List favoriteProjects = projectService.getFavoriteProjects(user);
         return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(favoriteProjects));
@@ -307,9 +316,9 @@ public ResponseEntity getFavoriteProjects(@CurrentUser User user,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "remove favorite projects")
     @DeleteMapping(value = "/remove/favorites")
-    public ResponseEntity removeFavoriteProjects(@CurrentUser User user,
+    public ResponseEntity removeFavoriteProjects(@ApiIgnore @CurrentUser User user,
                                                  @RequestBody Long[] projectIds,
                                                  HttpServletRequest request) {
         for (Long id : projectIds) {
@@ -332,11 +341,11 @@ public ResponseEntity removeFavoriteProjects(@CurrentUser User user,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "add an admin for a project")
     @PostMapping(value = "/{id}/admins")
     public ResponseEntity addProjectAdmin(@PathVariable Long id,
                                           @RequestBody Long[] adminIds,
-                                          @CurrentUser User user,
+                                          @ApiIgnore @CurrentUser User user,
                                           HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id");
@@ -361,11 +370,11 @@ public ResponseEntity addProjectAdmin(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "remove an admin from a project")
     @DeleteMapping(value = "/{id}/admin/{relationId}")
     public ResponseEntity removeProjectAdmin(@PathVariable Long id,
                                              @PathVariable Long relationId,
-                                             @CurrentUser User user,
+                                             @ApiIgnore @CurrentUser User user,
                                              HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id");
@@ -391,11 +400,11 @@ public ResponseEntity removeProjectAdmin(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "add project role relations")
     @PostMapping(value = "/{id}/roles", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity addRoles(@PathVariable Long id,
                                    @RequestBody Long[] roleIds,
-                                   @CurrentUser User user,
+                                   @ApiIgnore @CurrentUser User user,
                                    HttpServletRequest request) {
         if (invalidId(id)) {
diff --git a/server/src/main/java/edp/davinci/controller/RoleController.java b/server/src/main/java/edp/davinci/controller/RoleController.java
index b380208a8..accce1b40 100644
--- a/server/src/main/java/edp/davinci/controller/RoleController.java
+++ b/server/src/main/java/edp/davinci/controller/RoleController.java
@@ -21,7 +21,6 @@

 import edp.core.annotation.CurrentUser;
-import edp.core.annotation.MethodLog;
 import edp.davinci.common.controller.BaseController;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.common.ResultMap;
@@ -29,21 +28,28 @@
 import edp.davinci.model.Role;
 import edp.davinci.model.User;
 import edp.davinci.service.RoleService;
+import io.swagger.annotations.Api;
+import io.swagger.annotations.ApiOperation;
+import io.swagger.annotations.ApiResponse;
+import io.swagger.annotations.ApiResponses;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.http.MediaType;
 import org.springframework.http.ResponseEntity;
 import org.springframework.validation.BindingResult;
 import org.springframework.web.bind.annotation.*;
+import springfox.documentation.annotations.ApiIgnore;

 import javax.servlet.http.HttpServletRequest;
 import javax.validation.Valid;
 import java.util.Arrays;
 import java.util.List;

+@Api(value = "/roles", tags = "roles", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
+@ApiResponses(@ApiResponse(code = 404, message = "role not found"))
 @Slf4j
 @RestController
-@RequestMapping(value = Constants.BASE_API_PATH + "/roles", produces = MediaType.APPLICATION_JSON_VALUE)
+@RequestMapping(value = Constants.BASE_API_PATH + "/roles", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
 public class RoleController extends BaseController {

     @Autowired
@@ -59,11 +65,11 @@ public class RoleController extends BaseController {
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "create role", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity createRole(@Valid @RequestBody RoleCreate role,
-                                     BindingResult bindingResult,
-                                     @CurrentUser User user,
+                                     @ApiIgnore BindingResult bindingResult,
+                                     @ApiIgnore @CurrentUser User user,
                                      HttpServletRequest request) {
         if (bindingResult.hasErrors()) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage());
@@ -84,10 +90,10 @@ public ResponseEntity createRole(@Valid @RequestBody RoleCreate role,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete role")
     @DeleteMapping("/{id}")
     public ResponseEntity deleteRole(@PathVariable Long id,
-                                     @CurrentUser User user,
+                                     @ApiIgnore @CurrentUser User user,
                                      HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id");
@@ -110,12 +116,12 @@ public ResponseEntity deleteRole(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update role")
     @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateRole(@PathVariable Long id,
                                      @Valid @RequestBody RoleUpdate role,
-                                     BindingResult bindingResult,
-                                     @CurrentUser User user,
+                                     @ApiIgnore BindingResult bindingResult,
+                                     @ApiIgnore @CurrentUser User user,
                                      HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id");
@@ -141,10 +147,10 @@ public ResponseEntity updateRole(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "get role")
     @GetMapping("/{id}")
     public ResponseEntity getRole(@PathVariable Long id,
-                                  @CurrentUser User user,
+                                  @ApiIgnore @CurrentUser User user,
                                   HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id");
@@ -165,11 +171,11 @@ public ResponseEntity getRole(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "add relation between a role and members")
     @PostMapping(value = "/{id}/members", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity addMember(@PathVariable Long id,
                                     @RequestBody Long[] memberIds,
-                                    @CurrentUser User user,
+                                    @ApiIgnore @CurrentUser User user,
                                     HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id");
@@ -189,10 +195,10 @@ public ResponseEntity addMember(@PathVariable Long id,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "delete relation between a role and a member")
     @DeleteMapping("/member/{relationId}")
     public ResponseEntity deleteMember(@PathVariable Long relationId,
-                                       @CurrentUser User user,
+                                       @ApiIgnore @CurrentUser User user,
                                        HttpServletRequest request) {
         if (invalidId(relationId)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid relation id");
@@ -213,11 +219,11 @@ public ResponseEntity deleteMember(@PathVariable Long relationId,
      * @param request
      * @return
      */
-    @MethodLog
+    @ApiOperation(value = "update role member relations", consumes = MediaType.APPLICATION_JSON_VALUE)
     @PutMapping(value = "/{id}/members", consumes = MediaType.APPLICATION_JSON_VALUE)
     public ResponseEntity updateMembers(@PathVariable Long id,
                                         @RequestBody Long[] memberIds,
-                                        @CurrentUser User user,
+                                        @ApiIgnore @CurrentUser User user,
                                         HttpServletRequest request) {
         if (invalidId(id)) {
             ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id");
@@ -242,10 +248,10 @@ public ResponseEntity updateMembers(@PathVariable Long id,
      * @param request
      * @return
*/ - @MethodLog + @ApiOperation(value = "get members") @GetMapping("/{id}/members") public ResponseEntity getMembers(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id"); @@ -266,11 +272,11 @@ public ResponseEntity getMembers(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "add relation between a role and a project", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(value = "/{id}/project/{projectId}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity addProject(@PathVariable Long id, @PathVariable Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id"); @@ -296,11 +302,11 @@ public ResponseEntity addProject(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "delete relation between a role and a project") @DeleteMapping("/{id}/project/{projectId}") public ResponseEntity deleteProject(@PathVariable Long id, @PathVariable Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid relation id"); @@ -328,13 +334,13 @@ public ResponseEntity deleteProject(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "update relation between a role and a project", consumes = MediaType.APPLICATION_JSON_VALUE) @PutMapping(value = "/{id}/project/{projectId}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity updateProjet(@PathVariable Long id, @PathVariable Long projectId, @Valid @RequestBody 
RelRoleProjectDto projectRole, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id"); @@ -365,11 +371,11 @@ public ResponseEntity updateProjet(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get role viz permission") @GetMapping(value = "/{id}/project/{projectId}/viz/visibility") public ResponseEntity getVizVisibility(@PathVariable Long id, @PathVariable Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid role id"); @@ -396,11 +402,11 @@ public ResponseEntity getVizVisibility(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "exclude role viz permission", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(value = "/{id}/viz/visibility", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity postVizvisibility(@PathVariable Long id, @RequestBody VizVisibility vizVisibility, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { diff --git a/server/src/main/java/edp/davinci/controller/ShareController.java b/server/src/main/java/edp/davinci/controller/ShareController.java index 48b040bd4..05653c6c0 100644 --- a/server/src/main/java/edp/davinci/controller/ShareController.java +++ b/server/src/main/java/edp/davinci/controller/ShareController.java @@ -20,11 +20,9 @@ package edp.davinci.controller; import com.alibaba.druid.util.StringUtils; -import com.webank.wedatasphere.dss.visualis.query.service.VirtualViewQueryService; import edp.core.annotation.AuthIgnore; import edp.core.annotation.AuthShare; import 
edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.enums.HttpCodeEnum; import edp.core.model.Paginate; import edp.davinci.common.controller.BaseController; @@ -39,30 +37,35 @@ import edp.davinci.dto.viewDto.ViewExecuteParam; import edp.davinci.model.User; import edp.davinci.service.ShareService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; +import java.sql.SQLException; import java.util.Map; +@Api(value = "/share", tags = "share", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "resource not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/share", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/share", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class ShareController extends BaseController { @Autowired private ShareService shareService; - @Autowired - private VirtualViewQueryService virtualViewQueryService; - /** * share页登录 @@ -72,12 +75,12 @@ public class ShareController extends BaseController { * @param bindingResult * @return */ - @MethodLog + @ApiOperation(value = "share login") @AuthIgnore @PostMapping("/login/{token}") public ResponseEntity shareLogin(@PathVariable String token, @Valid @RequestBody UserLogin userLogin, - BindingResult bindingResult) { + @ApiIgnore BindingResult bindingResult) { if (StringUtils.isEmpty(token)) { ResultMap 
resultMap = new ResultMap().fail().message("Invalid token"); @@ -101,11 +104,11 @@ public ResponseEntity shareLogin(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share dashboard") @AuthShare @GetMapping("/dashboard/{token}") public ResponseEntity getShareDashboard(@PathVariable String token, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(token)) { ResultMap resultMap = new ResultMap().fail().message("Invalid share token"); @@ -129,11 +132,11 @@ public ResponseEntity getShareDashboard(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share display") @AuthShare @GetMapping("/display/{token}") public ResponseEntity getShareDisplay(@PathVariable String token, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(token)) { ResultMap resultMap = new ResultMap().fail().message("Invalid share token"); @@ -157,11 +160,11 @@ public ResponseEntity getShareDisplay(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share widget") @AuthShare @GetMapping("/widget/{token}") public ResponseEntity getShareWidget(@PathVariable String token, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(token)) { ResultMap resultMap = new ResultMap().fail().message("Invalid share token"); @@ -186,26 +189,20 @@ public ResponseEntity getShareWidget(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share data") @AuthShare @PostMapping(value = "/data/{token}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity getShareData(@PathVariable String token, @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, - HttpServletRequest request) throws Exception { + 
@ApiIgnore @CurrentUser User user, + HttpServletRequest request) throws SQLException { if (StringUtils.isEmpty(token)) { ResultMap resultMap = new ResultMap().fail().message("Invalid share token"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - Paginate<Map<String, Object>> shareData; - if(executeParam.getView() == null){ - shareData = shareService.getShareData(token, executeParam, user, request); - } else { - shareData = virtualViewQueryService.getData(executeParam, user, true); - } - + Paginate<Map<String, Object>> shareData = shareService.getShareData(token, executeParam, user); if (null == user) { return ResponseEntity.ok(new ResultMap().success().payload(shareData)); } else { @@ -225,14 +222,14 @@ public ResponseEntity getShareData(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share data") @AuthShare @PostMapping(value = "/data/{token}/distinctvalue/{viewId}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity getDistinctValue(@PathVariable("token") String token, @PathVariable("viewId") Long viewId, @Valid @RequestBody DistinctParam param, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(token)) { @@ -254,7 +251,8 @@ public ResponseEntity getDistinctValue(@PathVariable("token") String token, ResultMap resultMap = shareService.getDistinctValue(token, viewId, param, user, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("share token error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -269,12 +267,12 @@ public ResponseEntity getDistinctValue(@PathVariable("token") String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get share data csv") @AuthShare
@PostMapping(value = "/csv/{token}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity generationShareDataCsv(@PathVariable String token, @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(token)) { diff --git a/server/src/main/java/edp/davinci/controller/SourceController.java b/server/src/main/java/edp/davinci/controller/SourceController.java index 1ec8df0bd..4802684ec 100644 --- a/server/src/main/java/edp/davinci/controller/SourceController.java +++ b/server/src/main/java/edp/davinci/controller/SourceController.java @@ -20,10 +20,8 @@ package edp.davinci.controller; import com.alibaba.druid.util.StringUtils; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; import com.webank.wedatasphere.dss.visualis.utils.HttpUtils; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.model.DBTables; import edp.core.model.TableInfo; import edp.davinci.common.controller.BaseController; @@ -33,30 +31,34 @@ import edp.davinci.model.Source; import edp.davinci.model.User; import edp.davinci.service.SourceService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.BeanUtils; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.http.HttpStatus; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; import org.springframework.web.multipart.MultipartFile; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; import java.util.List; +@Api(value = "/sources", 
tags = "sources", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "sources not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/sources", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/sources", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class SourceController extends BaseController { @Autowired private SourceService sourceService; - @Autowired - private ProjectAuth projectAuth; /** * 获取source列表 @@ -66,10 +68,10 @@ public class SourceController extends BaseController { * @param request * @return */ - @MethodLog + @ApiOperation(value = "get sources") @GetMapping public ResponseEntity getSources(@RequestParam Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(projectId)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); @@ -88,10 +90,10 @@ public ResponseEntity getSources(@RequestParam Long projectId, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get source detail") @GetMapping("/{id}") public ResponseEntity getSourceDetail(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); @@ -111,11 +113,11 @@ public ResponseEntity getSourceDetail(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "create source", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity createSource(@Valid @RequestBody SourceCreate source, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { 
if (bindingResult.hasErrors()) { @@ -123,10 +125,6 @@ public ResponseEntity createSource(@Valid @RequestBody SourceCreate source, return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - if(!projectAuth.isPorjectOwner(source.getProjectId(), user.getId())) { - return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build(); - } - Source record = sourceService.createSource(source, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(record)); } @@ -142,12 +140,12 @@ public ResponseEntity createSource(@Valid @RequestBody SourceCreate source, * @param request * @return */ - @MethodLog + @ApiOperation(value = "update a source", consumes = MediaType.APPLICATION_JSON_VALUE) @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity updateSource(@PathVariable Long id, @Valid @RequestBody SourceInfo source, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { @@ -173,10 +171,10 @@ public ResponseEntity updateSource(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "delete a source") @DeleteMapping("/{id}") public ResponseEntity deleteSource(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -198,11 +196,11 @@ public ResponseEntity deleteSource(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "test source", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(value = "/test", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity testSource(@Valid @RequestBody SourceTest sourceTest, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -222,10 
+220,10 @@ public ResponseEntity testSource(@Valid @RequestBody SourceTest sourceTest, * @param request * @return */ - @MethodLog + @ApiOperation(value = "release and reconnect", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(value = "/reconnect/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity reconnect(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { sourceService.reconnect(id, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request)); @@ -242,12 +240,12 @@ public ResponseEntity reconnect(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "create csv meta", consumes = MediaType.APPLICATION_JSON_VALUE) @PostMapping(value = "{id}/csvmeta", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity createCsvmeta(@PathVariable Long id, @Valid @RequestBody UploadMeta uploadMeta, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -275,14 +273,14 @@ public ResponseEntity createCsvmeta(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "upload csv/excel file") @PostMapping("{id}/upload{type}") public ResponseEntity uploadData(@PathVariable Long id, @PathVariable String type, @Valid @ModelAttribute(value = "sourceDataUpload") SourceDataUpload sourceDataUpload, - BindingResult bindingResult, + @ApiIgnore BindingResult bindingResult, @RequestParam("file") MultipartFile file, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -313,10 +311,10 @@ public ResponseEntity uploadData(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get dbs") @GetMapping("/{id}/databases") public ResponseEntity getSourceDbs(@PathVariable 
Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Inavlid source id"); @@ -336,11 +334,11 @@ public ResponseEntity getSourceDbs(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get tables") @GetMapping("/{id}/tables") public ResponseEntity getSourceTables(@PathVariable Long id, @RequestParam(name = "dbName") String dbName, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Inavlid source id"); @@ -363,12 +361,12 @@ public ResponseEntity getSourceTables(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get columns") @GetMapping("/{id}/table/columns") public ResponseEntity getTableColumns(@PathVariable Long id, @RequestParam(name = "dbName") String dbName, @RequestParam(name = "tableName") String tableName, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Inavlid source id"); @@ -398,9 +396,9 @@ public ResponseEntity getTableColumns(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get jdbc datasources") @GetMapping("/jdbc/datasources") - public ResponseEntity getJdbcDataSources(@CurrentUser User user, HttpServletRequest request) { + public ResponseEntity getJdbcDataSources(@ApiIgnore @CurrentUser User user, HttpServletRequest request) { List list = sourceService.getDatasources(); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(list)); } diff --git a/server/src/main/java/edp/davinci/controller/StarController.java 
b/server/src/main/java/edp/davinci/controller/StarController.java index 5e17843fd..f88046cbf 100644 --- a/server/src/main/java/edp/davinci/controller/StarController.java +++ b/server/src/main/java/edp/davinci/controller/StarController.java @@ -20,34 +20,40 @@ package edp.davinci.controller; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.enums.HttpCodeEnum; import edp.davinci.common.controller.BaseController; import edp.davinci.core.common.Constants; import edp.davinci.core.common.ResultMap; import edp.davinci.model.User; import edp.davinci.service.StarService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.web.bind.annotation.*; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; +@Api(value = "/star", tags = "star", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "star not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/star", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/star", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class StarController extends BaseController { @Autowired private StarService starService; - @MethodLog + @ApiOperation(value = "star or unstar project") @PostMapping("/project/{id}") public ResponseEntity starProject(@PathVariable Long id, - @CurrentUser User user, - HttpServletRequest request) { + @ApiIgnore @CurrentUser User user, + @ApiIgnore HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new 
ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); @@ -57,15 +63,16 @@ public ResponseEntity starProject(@PathVariable Long id, ResultMap resultMap = starService.starAndUnstar(Constants.STAR_TARGET_PROJECT, id, user, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("star project error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } - @MethodLog + @ApiOperation(value = "get project star user list") @GetMapping("/project/{id}") public ResponseEntity getStarUsers(@PathVariable Long id, - HttpServletRequest request) { + @ApiIgnore HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); @@ -75,21 +82,23 @@ public ResponseEntity getStarUsers(@PathVariable Long id, ResultMap resultMap = starService.getStarUserListByTarget(Constants.STAR_TARGET_PROJECT, id, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("get star user error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } - @MethodLog + @ApiOperation(value = "get my star project list") @GetMapping("/mystar/project") - public ResponseEntity getMyStarProjects(@CurrentUser User user, - HttpServletRequest request) { + public ResponseEntity getMyStarProjects(@ApiIgnore @CurrentUser User user, + @ApiIgnore HttpServletRequest request) { try { ResultMap resultMap = starService.getStarListByUser(Constants.STAR_TARGET_PROJECT, user, request); return 
ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("get mystar project error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } diff --git a/server/src/main/java/edp/davinci/controller/StatisticController.java b/server/src/main/java/edp/davinci/controller/StatisticController.java index bba6429b3..859a092bb 100644 --- a/server/src/main/java/edp/davinci/controller/StatisticController.java +++ b/server/src/main/java/edp/davinci/controller/StatisticController.java @@ -17,7 +17,6 @@ */ package edp.davinci.controller; -import edp.core.annotation.MethodLog; import edp.core.utils.TokenUtils; import edp.davinci.common.model.ValidList; import edp.davinci.core.common.Constants; @@ -26,6 +25,10 @@ import edp.davinci.dto.statistic.DavinciStatisticTerminalInfo; import edp.davinci.dto.statistic.DavinciStatisticVisitorOperationInfo; import edp.davinci.service.BuriedPointsService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; @@ -38,9 +41,11 @@ import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; +@Api(value = "/statistic", tags = "statistic", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "statistic not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/statistic", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/statistic", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class StatisticController { @Autowired @@ -49,6 +54,7 @@ public class StatisticController { @Autowired public 
TokenUtils tokenUtils; + @ApiOperation(value = "collect duration info") @PostMapping(value = "/duration", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity collectDurationInfo(@Valid @RequestBody ValidList<DavinciStatisticDurationInfo> durationInfos, HttpServletRequest request){ @@ -58,7 +64,7 @@ public ResponseEntity collectDurationInfo(@Valid @RequestBody ValidList terminalInfoInfos, HttpServletRequest request){ @@ -68,7 +74,7 @@ public ResponseEntity collectTerminalInfo(@Valid @RequestBody ValidList visitorOperationInfos, HttpServletRequest request){ diff --git a/server/src/main/java/edp/davinci/controller/UserController.java b/server/src/main/java/edp/davinci/controller/UserController.java index 1d860fbb0..ecd2b78eb 100644 --- a/server/src/main/java/edp/davinci/controller/UserController.java +++ b/server/src/main/java/edp/davinci/controller/UserController.java @@ -22,7 +22,6 @@ import com.alibaba.druid.util.StringUtils; import edp.core.annotation.AuthIgnore; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.enums.HttpCodeEnum; import edp.davinci.common.controller.BaseController; import edp.davinci.core.common.Constants; @@ -30,6 +29,10 @@ import edp.davinci.dto.userDto.*; import edp.davinci.model.User; import edp.davinci.service.UserService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.BeanUtils; import org.springframework.beans.factory.annotation.Autowired; @@ -38,14 +41,17 @@ import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; import org.springframework.web.multipart.MultipartFile; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; import java.util.List; +@Api(value = "/users", tags = "users", produces =
MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "user not found")) @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/users", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/users", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) @Slf4j public class UserController extends BaseController { @@ -59,10 +65,10 @@ public class UserController extends BaseController { * @param bindingResult * @return */ - @MethodLog + @ApiOperation(value = "insert user") @AuthIgnore @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE) - public ResponseEntity regist(@Valid @RequestBody UserRegist userRegist, BindingResult bindingResult) { + public ResponseEntity regist(@Valid @RequestBody UserRegist userRegist, @ApiIgnore BindingResult bindingResult) { if (bindingResult.hasErrors()) { ResultMap resultMap = new ResultMap().fail().message(bindingResult.getFieldErrors().get(0).getDefaultMessage()); @@ -80,7 +86,7 @@ public ResponseEntity regist(@Valid @RequestBody UserRegist userRegist, BindingR * @param request * @return */ - @MethodLog + @ApiOperation(value = "active user") @AuthIgnore @PostMapping(value = "/active/{token}") public ResponseEntity activate(@PathVariable String token, @@ -95,7 +101,8 @@ public ResponseEntity activate(@PathVariable String token, ResultMap resultMap = userService.activateUserNoLogin(token, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("user active error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -137,11 +144,11 @@ public ResponseEntity activate(@PathVariable String token, * @param request * @return */ - @MethodLog + @ApiOperation(value = "user active sendmail") @PostMapping(value = "/sendmail", consumes = MediaType.APPLICATION_JSON_VALUE) public 
ResponseEntity sendMail(@Valid @RequestBody SendMail sendMail, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -162,11 +169,11 @@ public ResponseEntity sendMail(@Valid @RequestBody SendMail sendMail, * @param request * @return */ - @MethodLog + @ApiOperation(value = "update user info", consumes = MediaType.APPLICATION_JSON_VALUE) @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity putUser(@PathVariable Long id, @RequestBody UserPut userPut, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid user id"); @@ -191,12 +198,12 @@ public ResponseEntity putUser(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "change user password", consumes = MediaType.APPLICATION_JSON_VALUE) @PutMapping(value = "/{id}/changepassword", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity changeUserPassword(@PathVariable Long id, @Valid @RequestBody ChangePassword changePassword, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -217,7 +224,8 @@ public ResponseEntity changeUserPassword(@PathVariable Long id, ResultMap resultMap = userService.changeUserPassword(user, changePassword.getOldPassword(), changePassword.getPassword(), request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("change password error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -231,11 
+239,11 @@ public ResponseEntity changeUserPassword(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "upload avatar") @PostMapping(value = "/{id}/avatar") public ResponseEntity uploadAvatar(@PathVariable Long id, @RequestParam("file") MultipartFile file, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -257,7 +265,8 @@ public ResponseEntity uploadAvatar(@PathVariable Long id, ResultMap resultMap = userService.uploadAvatar(user, file, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("avatar user error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } @@ -270,11 +279,11 @@ public ResponseEntity uploadAvatar(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get users by keyword") @GetMapping public ResponseEntity getUsers(@RequestParam("keyword") String keyword, @RequestParam(value = "orgId", required = false) Long orgId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (StringUtils.isEmpty(keyword)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("keyword can not EMPTY"); @@ -291,10 +300,10 @@ public ResponseEntity getUsers(@RequestParam("keyword") String keyword, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get user profile") @GetMapping("/profile/{id}") public ResponseEntity getUser(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid user id"); @@ -304,7 +313,8 @@ public ResponseEntity getUser(@PathVariable Long id, ResultMap resultMap = 
userService.getUserProfile(id, user, request); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } catch (Exception e) { - log.error("search user error: " + e); + e.printStackTrace(); + log.error(e.getMessage()); return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); } } diff --git a/server/src/main/java/edp/davinci/controller/ViewController.java b/server/src/main/java/edp/davinci/controller/ViewController.java index 42d9734d7..985c5cd9e 100644 --- a/server/src/main/java/edp/davinci/controller/ViewController.java +++ b/server/src/main/java/edp/davinci/controller/ViewController.java @@ -19,15 +19,8 @@ package edp.davinci.controller; -import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; -import com.webank.wedatasphere.dss.visualis.query.service.VirtualViewQueryService; -import com.webank.wedatasphere.dss.visualis.query.utils.EnvLimitUtils; -import com.webank.wedatasphere.dss.visualis.query.utils.QueryUtils; -import com.webank.wedatasphere.dss.visualis.utils.HiveDBHelper; -import com.webank.wedatasphere.dss.visualis.utils.HttpUtils; +import com.webank.wedatasphere.dss.visualis.service.hive.HiveDBHelper; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.core.model.Paginate; import edp.core.model.PaginateWithQueryColumns; import edp.davinci.common.controller.BaseController; @@ -38,42 +31,40 @@ import edp.davinci.model.DacChannel; import edp.davinci.model.User; import edp.davinci.service.ViewService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang.StringUtils; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.CacheControl; -import org.springframework.http.HttpStatus; import 
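The hunks above repeatedly replace `log.error("... error: " + e)` with `e.printStackTrace()` plus `log.error(e.getMessage())`, which sends the stack trace to stderr rather than the log. A minimal plain-JDK sketch of why passing the throwable itself to the logger (as slf4j's `log.error(message, e)` does) is usually preferable — it keeps the message and the frames together:

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class Main {
    // Render a throwable the way log.error(message, e) would: message plus frames.
    // Logging only e.getMessage() keeps the first part and silently drops the second.
    static String render(Throwable e) {
        StringWriter sw = new StringWriter();
        e.printStackTrace(new PrintWriter(sw));
        return sw.toString();
    }

    public static void main(String[] args) {
        String out = render(new IllegalStateException("change password error"));
        System.out.println(out.contains("change password error")); // message survives
        System.out.println(out.contains("Main.main"));             // and so do the frames
    }
}
```

With `printStackTrace()` the frames still exist, but on stderr, where a log aggregator configured for the application log will not see them.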
org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; +import java.sql.SQLException; import java.util.List; import java.util.Map; +@Api(value = "/views", tags = "views", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "view not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/views", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/views", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class ViewController extends BaseController { @Autowired private ViewService viewService; - @Autowired - private VirtualViewQueryService virtualViewQueryService; - @Autowired private DacChannelUtil dacChannelUtil; - @Autowired private HiveDBHelper hiveDBHelper; - @Autowired - private ProjectAuth projectAuth; - - // 工作流创建widget时调用Step 3 /** * 获取view * @@ -82,13 +73,11 @@ public class ViewController extends BaseController { * @param request * @return */ - @MethodLog + @ApiOperation(value = "get views") @GetMapping public ResponseEntity getViews(@RequestParam Long projectId, - @RequestParam(required = false) String contextId, - @RequestParam(required = false) String nodeName, - @CurrentUser User user, - HttpServletRequest request) throws Exception { + @ApiIgnore @CurrentUser User user, + HttpServletRequest request) { if (invalidId(projectId)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); @@ -96,12 +85,6 @@ public ResponseEntity getViews(@RequestParam Long projectId, } List views = viewService.getViews(projectId, user); - if (StringUtils.isNotBlank(contextId) && StringUtils.isNotBlank(nodeName)) { - List virtualviews = 
Lists.newArrayList(); - virtualviews.addAll(QueryUtils.getFromContext(contextId, nodeName)); - virtualviews.addAll(views); - return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(virtualviews)); - } return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(views)); } @@ -114,10 +97,10 @@ public ResponseEntity getViews(@RequestParam Long projectId, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get view info") @GetMapping("/{id}") public ResponseEntity getView(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -139,11 +122,11 @@ public ResponseEntity getView(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "create view") @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity createView(@Valid @RequestBody ViewCreate view, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -151,16 +134,7 @@ public ResponseEntity createView(@Valid @RequestBody ViewCreate view, return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - if (EnvLimitUtils.notPermitted()) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(EnvLimitUtils.ERROR_MESSAGE); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - - if(!projectAuth.isPorjectOwner(view.getProjectId(), user.getId())) { - return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build(); - } - - ViewWithSourceBaseInfo viewWithSourceBaseInfo = viewService.createView(view, user, HttpUtils.getUserTicketId(request)); + ViewWithSourceBaseInfo viewWithSourceBaseInfo = viewService.createView(view, user); return ResponseEntity.ok(new 
ResultMap(tokenUtils).successAndRefreshToken(request).payload(viewWithSourceBaseInfo)); } @@ -176,12 +150,12 @@ public ResponseEntity createView(@Valid @RequestBody ViewCreate view, * @param request * @return */ - @MethodLog + @ApiOperation(value = "update view") @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity updateView(@PathVariable Long id, @Valid @RequestBody ViewUpdate viewUpdate, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { @@ -195,11 +169,6 @@ public ResponseEntity updateView(@PathVariable Long id, return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - if (EnvLimitUtils.notPermitted()) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(EnvLimitUtils.ERROR_MESSAGE); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - viewService.updateView(viewUpdate, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request)); } @@ -213,24 +182,84 @@ public ResponseEntity updateView(@PathVariable Long id, * @param request * @return */ - @MethodLog - @PostMapping("/{id}") + @ApiOperation(value = "delete view") + @DeleteMapping("/{id}") public ResponseEntity deleteView(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid view id"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - if (EnvLimitUtils.notPermitted()) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(EnvLimitUtils.ERROR_MESSAGE); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - viewService.deleteView(id, user); return ResponseEntity.ok(new 
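The `createView` hunk above removes a `projectAuth.isPorjectOwner` guard that answered 401 for non-owners. A hypothetical sketch of that kind of ownership gate — the `Map` stands in for the real project store, and all names here are illustrative, not the project's actual API:

```java
import java.util.HashMap;
import java.util.Map;

public class Main {
    // Hypothetical stand-in for the project store behind projectAuth.
    static final Map<Long, Long> projectOwners = new HashMap<>();

    // Mirrors the removed guard: only the project owner may create views in it.
    static boolean isProjectOwner(Long projectId, Long userId) {
        return userId != null && userId.equals(projectOwners.get(projectId));
    }

    public static void main(String[] args) {
        projectOwners.put(1L, 100L);
        System.out.println(isProjectOwner(1L, 100L)); // owner: request proceeds
        System.out.println(isProjectOwner(1L, 200L)); // not owner: controller returned 401
    }
}
```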
ResultMap(tokenUtils).successAndRefreshToken(request)); + + + //TODO Hive related logic, need to be reconsidered +//======= +// try { +// ResultMap resultMap = viewService.deleteView(id, user, request); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } catch (Exception e) { +// e.printStackTrace(); +// log.error(e.getMessage()); +// return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); +// } +// } +// +// /** +// * 获取数据库schema信息 +// * +// * @param sourceId +// * @param user +// * @param request +// * @return +// */ +// @ApiOperation(value = "get view data schema") +// @GetMapping("/database") +// public ResponseEntity getSourceSchema(@RequestParam String sourceId, +// @ApiIgnore @CurrentUser User user, +// HttpServletRequest request) { +// /* +// *如果sourceID是hive_ 开头的话,那么就是hive的数据库,否则就是jdbc的数据库 +// * */ +// if (StringUtils.isEmpty(sourceId)) { +// log.warn("sourceId is empty, can not get right source for user: {}", user); +// ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Empty source id"); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } +// +// //update by johnnwang +// if (sourceId.startsWith(HiveDBHelper.HIVE_PREFIX)) { +// //deal hive source +// try { +// ResultMap resultMap = hiveDBHelper.getHiveSourceSchema(sourceId, user, request); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } catch (final Exception e) { +// log.error("获取sourceId {} 的hive数据库表信息失败", sourceId, e); +// return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); +// } +// } else if (VGUtils.getHiveDataSourceId() == Long.parseLong(sourceId)) { +// ResultMap resultMap = new ResultMap(tokenUtils).success(); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } else { +// Long longSourceId = Long.parseLong(sourceId); +// if 
(invalidId(longSourceId)) { +// ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Inavlid source id"); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } +// try { +// ResultMap resultMap = viewService.getSourceSchema(longSourceId, user, request); +// return ResponseEntity.status(resultMap.getCode()).body(resultMap); +// } catch (Exception e) { +// e.printStackTrace(); +// log.error(e.getMessage()); +// return ResponseEntity.status(HttpCodeEnum.SERVER_ERROR.getCode()).body(HttpCodeEnum.SERVER_ERROR.getMessage()); +// } +// } +//>>>>>>> drawis } @@ -243,11 +272,11 @@ public ResponseEntity deleteView(@PathVariable Long id, * @param request * @return */ - @MethodLog + @ApiOperation(value = "executesql") @PostMapping(value = "/executesql", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity executeSql(@Valid @RequestBody ViewExecuteSql executeSql, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -255,11 +284,6 @@ public ResponseEntity executeSql(@Valid @RequestBody ViewExecuteSql executeSql, return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - if (EnvLimitUtils.notPermitted()) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(EnvLimitUtils.ERROR_MESSAGE); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - PaginateWithQueryColumns paginateWithQueryColumns = viewService.executeSql(executeSql, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(paginateWithQueryColumns)); } @@ -274,95 +298,30 @@ public ResponseEntity executeSql(@Valid @RequestBody ViewExecuteSql executeSql, * @param request * @return */ - @MethodLog + @ApiOperation(value = "get data") @PostMapping(value = "/{id}/getdata", consumes = 
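The commented-out `getSourceSchema` block routes by source id: ids starting with `HiveDBHelper.HIVE_PREFIX` go to the hive helper, everything else must parse as a positive numeric JDBC source id. A sketch of that dispatch rule — the `"hive_"` prefix value is an assumption here; the real constant lives in `HiveDBHelper`:

```java
public class Main {
    static final String HIVE_PREFIX = "hive_"; // assumed value of HiveDBHelper.HIVE_PREFIX

    // Mirrors the commented-out branch logic: hive sources are recognized by prefix,
    // everything else must be a valid numeric JDBC source id.
    static String route(String sourceId) {
        if (sourceId == null || sourceId.isEmpty()) {
            return "invalid"; // "Empty source id" branch
        }
        if (sourceId.startsWith(HIVE_PREFIX)) {
            return "hive";    // handled by hiveDBHelper.getHiveSourceSchema(...)
        }
        try {
            long id = Long.parseLong(sourceId);
            return id > 0 ? "jdbc" : "invalid"; // viewService.getSourceSchema(...) path
        } catch (NumberFormatException e) {
            return "invalid";
        }
    }

    public static void main(String[] args) {
        System.out.println(route("hive_default")); // hive
        System.out.println(route("42"));           // jdbc
        System.out.println(route("abc"));          // invalid
    }
}
```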
MediaType.APPLICATION_JSON_VALUE) public ResponseEntity getData(@PathVariable Long id, @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, - HttpServletRequest request) throws Exception { - if (invalidId(id) && executeParam.getView() == null) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid view id"); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - - Paginate> paginate; - if (executeParam.getView() == null) { - paginate = viewService.getData(id, executeParam, user, true); - } else { - paginate = virtualViewQueryService.getData(executeParam, user, true); - } - return ResponseEntity.ok().cacheControl(CacheControl.noCache()).body(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(paginate)); - } - - @MethodLog - @PostMapping(value = "/{id}/getprogress", consumes = MediaType.APPLICATION_JSON_VALUE) - public ResponseEntity getProgress(@PathVariable String id, - @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, - HttpServletRequest request) throws Exception { + @ApiIgnore @CurrentUser User user, + HttpServletRequest request) throws SQLException { if (invalidId(id)) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid exec id"); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - - Paginate> paginate = viewService.getAsyncProgress(id, user); - return ResponseEntity.ok().cacheControl(CacheControl.noCache()).body(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(paginate)); - } - - @MethodLog - @PostMapping(value = "/{id}/kill", consumes = MediaType.APPLICATION_JSON_VALUE) - public ResponseEntity kill(@PathVariable String id, - @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, - HttpServletRequest request) throws Exception { - if (invalidId(id)) { - ResultMap resultMap = new 
ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid exec id"); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - - Paginate> paginate = viewService.killAsyncJob(id, user); - return ResponseEntity.ok().cacheControl(CacheControl.noCache()).body(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(paginate)); - } - - @MethodLog - @PostMapping(value = "/{id}/getresult", consumes = MediaType.APPLICATION_JSON_VALUE) - public ResponseEntity getResult(@PathVariable String id, - @RequestBody(required = false) ViewExecuteParam executeParam, - @CurrentUser User user, - HttpServletRequest request) throws Exception { - if (invalidId(id)) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid exec id"); + ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid view id"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - Paginate> paginate = viewService.getAsyncResult(id, user); - if (executeParam == null || executeParam.getPageNo() == -1) { - paginate.setPageNo(1); - paginate.setPageSize(new Long(paginate.getTotalCount()).intValue()); - } else { - paginate.setPageNo(executeParam.getPageNo()); - paginate.setPageSize(executeParam.getPageSize()); - } + Paginate> paginate = viewService.getData(id, executeParam, user); return ResponseEntity.ok().cacheControl(CacheControl.noCache()).body(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(paginate)); } - @MethodLog - @PostMapping(value = "/getdistinctvalue", consumes = MediaType.APPLICATION_JSON_VALUE) - public ResponseEntity getDistinctValueNoView(@Valid @RequestBody DistinctParam param, - BindingResult bindingResult, - @CurrentUser User user, - HttpServletRequest request) throws Exception { - return getDistinctValue(null, param, bindingResult, user, request); - } - @MethodLog + @ApiOperation(value = "get distinct value") @PostMapping(value = 
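The removed `getresult` endpoint normalizes paging before returning an async result: a missing body or `pageNo == -1` means "one page holding the entire result set", otherwise the caller's `pageNo`/`pageSize` are echoed back onto the paginate. The same rule as a small sketch:

```java
public class Main {
    // pageNo == -1 (or absent params) -> a single page covering the whole result set,
    // matching the removed getresult logic.
    static int[] pageWindow(Integer pageNo, Integer pageSize, long totalCount) {
        if (pageNo == null || pageNo == -1) {
            return new int[]{1, (int) totalCount};
        }
        return new int[]{pageNo, pageSize};
    }

    public static void main(String[] args) {
        int[] all = pageWindow(-1, 20, 250L);
        System.out.println(all[0] + "," + all[1]);   // 1,250
        int[] page = pageWindow(3, 20, 250L);
        System.out.println(page[0] + "," + page[1]); // 3,20
    }
}
```

Note the original cast `new Long(totalCount).intValue()` truncates above `Integer.MAX_VALUE`; the sketch inherits that limitation via the `(int)` cast.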
"/{id}/getdistinctvalue", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity getDistinctValue(@PathVariable Long id, @Valid @RequestBody DistinctParam param, - BindingResult bindingResult, - @CurrentUser User user, - HttpServletRequest request) throws Exception { - if (invalidId(id) && param.getView() == null) { + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, + HttpServletRequest request) { + if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid view id"); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } @@ -372,34 +331,31 @@ public ResponseEntity getDistinctValue(@PathVariable Long id, return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - List> distinctValue = Lists.newArrayList(); - if (param.getView() == null) { - distinctValue = viewService.getDistinctValue(id, param, user); - } else { - distinctValue = virtualViewQueryService.getDistinctValue(param, user); - } + List> distinctValue = viewService.getDistinctValue(id, param, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(distinctValue)); } - @MethodLog + + @ApiOperation(value = "get dac channels") @GetMapping("/dac/channels") - public ResponseEntity getDacChannels(@CurrentUser User user, HttpServletRequest request) { + public ResponseEntity getDacChannels(@ApiIgnore @CurrentUser User user, HttpServletRequest request) { Map dacMap = DacChannelUtil.dacMap; return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(dacMap.keySet())); } - @MethodLog + @ApiOperation(value = "get dac tenants") @GetMapping("/dac/{dacName}/tenants") - public ResponseEntity getDacTannets(@PathVariable String dacName, @CurrentUser User user, HttpServletRequest request) { + public ResponseEntity getDacTannets(@PathVariable String dacName, @ApiIgnore @CurrentUser User user, HttpServletRequest request) { return 
ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(dacChannelUtil.getTenants(dacName))); } - @MethodLog + + @ApiOperation(value = "get dac bizs") @GetMapping("/dac/{dacName}/tenants/{tenantId}/bizs") public ResponseEntity getDacBizs(@PathVariable String dacName, @PathVariable String tenantId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(dacChannelUtil.getBizs(dacName, tenantId))); } diff --git a/server/src/main/java/edp/davinci/controller/WidgetController.java b/server/src/main/java/edp/davinci/controller/WidgetController.java index ca665a1f0..cfa5f8db4 100644 --- a/server/src/main/java/edp/davinci/controller/WidgetController.java +++ b/server/src/main/java/edp/davinci/controller/WidgetController.java @@ -20,9 +20,7 @@ package edp.davinci.controller; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; import edp.core.annotation.CurrentUser; -import edp.core.annotation.MethodLog; import edp.davinci.common.controller.BaseController; import edp.davinci.core.common.Constants; import edp.davinci.core.common.ResultMap; @@ -32,33 +30,44 @@ import edp.davinci.model.User; import edp.davinci.model.Widget; import edp.davinci.service.WidgetService; +import io.swagger.annotations.Api; +import io.swagger.annotations.ApiOperation; +import io.swagger.annotations.ApiResponse; +import io.swagger.annotations.ApiResponses; import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.http.HttpStatus; import org.springframework.http.MediaType; import org.springframework.http.ResponseEntity; import org.springframework.validation.BindingResult; import org.springframework.web.bind.annotation.*; +import springfox.documentation.annotations.ApiIgnore; import javax.servlet.http.HttpServletRequest; import javax.validation.Valid; import 
java.util.List; +@Api(value = "/widgets", tags = "widgets", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) +@ApiResponses(@ApiResponse(code = 404, message = "widget not found")) @Slf4j @RestController -@RequestMapping(value = Constants.BASE_API_PATH + "/widgets", produces = MediaType.APPLICATION_JSON_VALUE) +@RequestMapping(value = Constants.BASE_API_PATH + "/widgets", produces = MediaType.APPLICATION_JSON_UTF8_VALUE) public class WidgetController extends BaseController { @Autowired private WidgetService widgetService; - @Autowired - private ProjectAuth projectAuth; - - @MethodLog + /** + * 获取widget列表 + * + * @param projectId + * @param user + * @param request + * @return + */ + @ApiOperation(value = "get widgets") @GetMapping public ResponseEntity getWidgets(@RequestParam Long projectId, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(projectId)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); @@ -70,10 +79,18 @@ public ResponseEntity getWidgets(@RequestParam Long projectId, } - @MethodLog + /** + * 获取widget列表 + * + * @param id + * @param user + * @param request + * @return + */ + @ApiOperation(value = "get widget info") @GetMapping("/{id}") public ResponseEntity getWidgetInfo(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id"); @@ -83,32 +100,48 @@ public ResponseEntity getWidgetInfo(@PathVariable Long id, return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(widget)); } - @MethodLog + + /** + * 新建widget + * + * @param widget + * @param bindingResult + * @param user + * @param request + * @return + */ + @ApiOperation(value = "create widget") @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity 
createWidgets(@Valid @RequestBody WidgetCreate widget, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message(bindingResult.getFieldErrors().get(0).getDefaultMessage()); return ResponseEntity.status(resultMap.getCode()).body(resultMap); } - - if(!projectAuth.isPorjectOwner(widget.getProjectId(), user.getId())) { - return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build(); - } - Widget newWidget = widgetService.createWidget(widget, user); return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payload(newWidget)); } - @MethodLog + + /** + * 修改widget + * + * @param id + * @param widget + * @param bindingResult + * @param user + * @param request + * @return + */ + @ApiOperation(value = "update widget") @PutMapping(value = "/{id}", consumes = MediaType.APPLICATION_JSON_VALUE) public ResponseEntity updateWidget(@PathVariable Long id, @Valid @RequestBody WidgetUpdate widget, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (bindingResult.hasErrors()) { @@ -126,10 +159,18 @@ public ResponseEntity updateWidget(@PathVariable Long id, } - @MethodLog - @PostMapping("/{id}") + /** + * 删除widget + * + * @param id + * @param user + * @param request + * @return + */ + @ApiOperation(value = "delete widget") + @DeleteMapping("/{id}") public ResponseEntity deleteWidget(@PathVariable Long id, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { @@ -142,13 +183,21 @@ public ResponseEntity deleteWidget(@PathVariable Long id, } - @MethodLog + /** + * 下载widget + * + * @param id + * @param user + * @param request + * @return + */ + @ApiOperation(value = 
"download widget") @PostMapping("/{id}/{type}") public ResponseEntity downloadWidget(@PathVariable("id") Long id, @PathVariable("type") String type, @Valid @RequestBody ViewExecuteParam executeParam, - BindingResult bindingResult, - @CurrentUser User user, + @ApiIgnore BindingResult bindingResult, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id"); @@ -165,11 +214,20 @@ public ResponseEntity downloadWidget(@PathVariable("id") Long id, } - @MethodLog + /** + * 分享widget + * + * @param id + * @param username + * @param user + * @param request + * @return + */ + @ApiOperation(value = "share widget") @GetMapping("/{id}/share") public ResponseEntity shareWidget(@PathVariable Long id, @RequestParam(required = false) String username, - @CurrentUser User user, + @ApiIgnore @CurrentUser User user, HttpServletRequest request) { if (invalidId(id)) { ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid id"); diff --git a/server/src/main/java/edp/davinci/core/common/Constants.java b/server/src/main/java/edp/davinci/core/common/Constants.java index 92f222d48..1f536dcb7 100644 --- a/server/src/main/java/edp/davinci/core/common/Constants.java +++ b/server/src/main/java/edp/davinci/core/common/Constants.java @@ -39,7 +39,7 @@ public class Constants extends Consts { */ public static final String AUTH_API_PATH = "/auth/v3"; - public static final String RESTFUL_BASE_PATH = "/visualis/"; + public static final String RESTFUL_BASE_PATH ="/visualis/"; /** * 用户激活 / 重发激活邮件模板 @@ -139,7 +139,7 @@ public class Constants extends Consts { public static final String REG_AUTHVAR = "\\([a-zA-Z0-9_.-[\\u4e00-\\u9fa5]*]+\\s*[\\w<>!=]*\\s*[a-zA-Z0-9_.-]*((\\(%s[a-zA-Z0-9_]+%s\\))|(%s[a-zA-Z0-9_]+%s))+\\s*\\)"; - public static final String REG_QUERYVAR = "\\$\\{([a-zA-Z0-9_.*]|[\\u4e00-\\u9fa5])+\\}"; + public static final String 
REG_QUERYVAR = "\\$\\{([a-zA-Z0-9_.*]|[\\u4e00-\\u9fa5])+\\}"; public static final String REG_CHINESE = "[\\u4e00-\\u9fa5]+"; @@ -150,10 +150,6 @@ public class Constants extends Consts { public static final String N0_AUTH_PERMISSION = "@DAVINCI_DATA_ACCESS_DENIED@"; public static final String DAVINCI_TOPIC_CHANNEL = "DAVINCI_TOPIC_CHANNEL"; - /** - * mysql text类型最大长度 - */ - public static final int TEXT_MAX_LENGTH = 65535; public static char getSqlTempDelimiter(String sqlTempDelimiter) { diff --git a/server/src/main/java/edp/davinci/core/config/RestExceptionHandler.java b/server/src/main/java/edp/davinci/core/config/RestExceptionHandler.java index 056009aa1..02712876a 100644 --- a/server/src/main/java/edp/davinci/core/config/RestExceptionHandler.java +++ b/server/src/main/java/edp/davinci/core/config/RestExceptionHandler.java @@ -50,6 +50,7 @@ public class RestExceptionHandler { @ResponseBody @ResponseStatus(HttpStatus.BAD_REQUEST) private ResultMap commonExceptionHandler(HttpServletRequest request, Exception e) { + e.printStackTrace(); log.error(e.getMessage()); return new ResultMap(tokenUtils).failAndRefreshToken(request).message(HttpStatus.INTERNAL_SERVER_ERROR.getReasonPhrase()); } @@ -58,6 +59,7 @@ private ResultMap commonExceptionHandler(HttpServletRequest request, Exception e @ResponseBody @ResponseStatus(HttpStatus.BAD_REQUEST) private ResultMap serverExceptionHandler(HttpServletRequest request, Exception e) { + e.printStackTrace(); log.error(e.getMessage()); return new ResultMap(tokenUtils).failAndRefreshToken(request).message(e.getMessage()); } diff --git a/server/src/main/java/edp/davinci/core/config/WebMvcConfig.java b/server/src/main/java/edp/davinci/core/config/WebMvcConfig.java index 71176ad96..91777ba89 100644 --- a/server/src/main/java/edp/davinci/core/config/WebMvcConfig.java +++ b/server/src/main/java/edp/davinci/core/config/WebMvcConfig.java @@ -151,7 +151,6 @@ public void addResourceHandlers(ResourceHandlerRegistry registry) { } - 
@SuppressWarnings("unchecked") @Override protected void configureMessageConverters(List> converters) { FastJsonHttpMessageConverter fastConverter = new FastJsonHttpMessageConverter(); @@ -171,7 +170,7 @@ protected void configureMessageConverters(List> converte //处理中文乱码问题 List fastMediaTypes = new ArrayList<>(); - fastMediaTypes.add(MediaType.APPLICATION_JSON); + fastMediaTypes.add(MediaType.APPLICATION_JSON_UTF8); fastMediaTypes.add(MediaType.IMAGE_PNG); fastConverter.setSupportedMediaTypes(fastMediaTypes); fastConverter.setFastJsonConfig(fastJsonConfig); diff --git a/server/src/main/java/edp/davinci/core/enums/DownloadTaskStatus.java b/server/src/main/java/edp/davinci/core/enums/DownloadTaskStatus.java index 4c0ec4fe5..b7c05e244 100644 --- a/server/src/main/java/edp/davinci/core/enums/DownloadTaskStatus.java +++ b/server/src/main/java/edp/davinci/core/enums/DownloadTaskStatus.java @@ -19,7 +19,13 @@ package edp.davinci.core.enums; - +/** + * Created by IntelliJ IDEA. + * + * @Author daemon + * @Date 19/5/30 10:13 + * To change this template use File | Settings | File Templates. + */ public enum DownloadTaskStatus { PROCESSING((short) 1), SUCCESS((short) 2), diff --git a/server/src/main/java/edp/davinci/core/enums/DownloadType.java b/server/src/main/java/edp/davinci/core/enums/DownloadType.java index 3b8aee181..6e40c9f50 100644 --- a/server/src/main/java/edp/davinci/core/enums/DownloadType.java +++ b/server/src/main/java/edp/davinci/core/enums/DownloadType.java @@ -19,7 +19,13 @@ package edp.davinci.core.enums; - +/** + * Created by IntelliJ IDEA. + * + * @Author daemon + * @Date 19/5/28 10:57 + * To change this template use File | Settings | File Templates. 
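`Constants.REG_QUERYVAR` above matches `${...}` query placeholders, allowing letters, digits, `_`, `.`, `*`, and CJK characters (`\u4e00`–`\u9fa5`) inside the braces. A quick self-contained check of what the pattern accepts:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class Main {
    // Same expression as Constants.REG_QUERYVAR.
    static final Pattern QUERY_VAR =
            Pattern.compile("\\$\\{([a-zA-Z0-9_.*]|[\\u4e00-\\u9fa5])+\\}");

    public static void main(String[] args) {
        String sql = "select * from t where dt = ${partition_date} and city = ${城市}";
        Matcher m = QUERY_VAR.matcher(sql);
        while (m.find()) {
            System.out.println(m.group()); // finds ${partition_date}, then ${城市}
        }
    }
}
```

Characters outside the class (e.g. `-` or a space) break the match, so `${a-b}` is not treated as a placeholder.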
+ */
 public enum DownloadType {
    Widget("widget"),
    DashBoard("dashboard"),
diff --git a/server/src/main/java/edp/davinci/core/inteceptor/CurrentUserMethodArgumentResolver.java b/server/src/main/java/edp/davinci/core/inteceptor/CurrentUserMethodArgumentResolver.java
index ce9085f26..087520607 100644
--- a/server/src/main/java/edp/davinci/core/inteceptor/CurrentUserMethodArgumentResolver.java
+++ b/server/src/main/java/edp/davinci/core/inteceptor/CurrentUserMethodArgumentResolver.java
@@ -19,18 +19,18 @@
 package edp.davinci.core.inteceptor;

-import org.apache.linkis.server.security.SecurityFilter;
+import com.webank.wedatasphere.linkis.server.security.SecurityFilter;
 import edp.core.annotation.CurrentUser;
+import edp.core.consts.Consts;
 import edp.core.inteceptor.CurrentUserMethodArgumentResolverInterface;
 import edp.davinci.dao.UserMapper;
 import edp.davinci.model.User;
 import lombok.extern.slf4j.Slf4j;
-import org.apache.commons.lang.StringUtils;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.core.MethodParameter;
 import org.springframework.web.bind.support.WebDataBinderFactory;
 import org.springframework.web.context.request.NativeWebRequest;
-import org.springframework.web.context.request.ServletWebRequest;
+import org.springframework.web.context.request.RequestAttributes;
 import org.springframework.web.method.support.ModelAndViewContainer;

 import javax.servlet.http.HttpServletRequest;
@@ -50,46 +50,13 @@ public boolean supportsParameter(MethodParameter parameter) {
            && parameter.hasParameterAnnotation(CurrentUser.class);
    }

-    /**
-     * Motivation:
-     * Visualis previously depended on the linkis_user table, which created tight coupling.
-     *
-     * Solution:
-     * Add a dedicated visualis_user table and reuse the existing permission logic;
-     * when a request carrying this annotation arrives for an unknown user, insert that user's record.
-     *
-     * Concurrent requests need two steps (query the table, then insert), so there is room for performance optimization here.
-     */
    @Override
    public Object resolveArgument(MethodParameter parameter, ModelAndViewContainer mavContainer, NativeWebRequest webRequest, WebDataBinderFactory binderFactory) {
-        try {
-            String dssUser = (String) ((ServletWebRequest) webRequest).getRequest().getAttribute("dss-user");
-            if (StringUtils.isNotBlank(dssUser)) {
-                return userMapper.selectByUsername(dssUser);
-            }
-            String accessUsername = SecurityFilter.getLoginUsername(webRequest.getNativeRequest(HttpServletRequest.class));
-            log.info("Get request access user name: {}", accessUsername);
-            User visualisUser = null;
-            //to do!
-            visualisUser = (User) userMapper.selectByUsername(accessUsername);
-            if(null == visualisUser) {
-                synchronized (this) {
-                    visualisUser = (User) userMapper.selectByUsername(accessUsername);
-                    log.info("Get visualis user from table: {}", visualisUser);
-                    User user = new User();
-                    if (null == visualisUser) {
-                        user.setUsername(accessUsername);
-                        user.setName(accessUsername);
-                        user.setPassword(null);
-                        log.info("Insert into visualis user: {}", user);
-                        userMapper.insert(user);
-                        return user;
-                    }
-                }
-            }
-            return visualisUser;
-        } catch (Throwable e) {
-            log.error("Failed to get user: ", e);
+        try
+        {
+            return (User)userMapper.selectByUsername(SecurityFilter.getLoginUsername(webRequest.getNativeRequest(HttpServletRequest.class)));
+        }catch (Throwable e){
+            log.error("Failed to get user:",e);
            throw e;
        }
    }
diff --git a/server/src/main/java/edp/davinci/core/model/Criterion.java b/server/src/main/java/edp/davinci/core/model/Criterion.java
index ac0b53cd4..ff5afd92e 100644
--- a/server/src/main/java/edp/davinci/core/model/Criterion.java
+++ b/server/src/main/java/edp/davinci/core/model/Criterion.java
@@ -52,7 +52,7 @@ public Criterion(String column, String operator, Object value, Object secondValu
    public boolean isNeedApostrophe(){
        return !Arrays.stream(SqlFilter.NumericDataType.values())
-                .filter(value -> this.dataType != null && this.dataType.equalsIgnoreCase(value.getType())).findFirst()
+                .filter(value -> this.dataType.equalsIgnoreCase(value.getType())).findFirst()
                .isPresent();
    }

diff --git a/server/src/main/java/edp/davinci/core/service/RedisMessageReceiver.java
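The removed `resolveArgument` body implements get-or-create for visualis users with a check / `synchronized` / re-check sequence. The same first-access-creates-the-user semantics can be sketched with `computeIfAbsent`, which performs the check and the creation atomically — the map here is an in-memory stand-in for `userMapper`, not the real DAO:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class Main {
    // Stand-in user store keyed by username; the real code queries/inserts via MyBatis.
    static final Map<String, String> users = new ConcurrentHashMap<>();

    // First access creates the record; later accesses return the same one,
    // mirroring the select -> synchronized -> re-select -> insert pattern above.
    static String getOrCreate(String username) {
        return users.computeIfAbsent(username, name -> "user:" + name);
    }

    public static void main(String[] args) {
        String first = getOrCreate("hadoop");
        String second = getOrCreate("hadoop");
        System.out.println(first.equals(second)); // true: repeat access hits the same record
    }
}
```

`ConcurrentHashMap.computeIfAbsent` guarantees the mapping function runs at most once per key, which is exactly the property the double-checked locking in the removed code was hand-rolling (a database-backed version would still need a unique key on username to be fully race-free).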
b/server/src/main/java/edp/davinci/core/service/RedisMessageReceiver.java index 70ac4586b..55f807ac3 100644 --- a/server/src/main/java/edp/davinci/core/service/RedisMessageReceiver.java +++ b/server/src/main/java/edp/davinci/core/service/RedisMessageReceiver.java @@ -37,7 +37,6 @@ public class RedisMessageReceiver { @Autowired private BeanFactory beanFactory; - @SuppressWarnings("unchecked") public void receive(RedisMessageEntity messageEntity) { if (messageEntity != null && messageEntity.getMessage() != null) { log.info("[ Redis ({}) received message, start handle......]", DAVINCI_TOPIC_CHANNEL); diff --git a/server/src/main/java/edp/davinci/core/utils/CsvUtils.java b/server/src/main/java/edp/davinci/core/utils/CsvUtils.java index 3b8abf2de..8f7ffc2c2 100644 --- a/server/src/main/java/edp/davinci/core/utils/CsvUtils.java +++ b/server/src/main/java/edp/davinci/core/utils/CsvUtils.java @@ -31,8 +31,6 @@ import org.apache.commons.csv.CSVParser; import org.apache.commons.csv.CSVPrinter; import org.apache.commons.csv.CSVRecord; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; import org.springframework.web.multipart.MultipartFile; import java.io.*; @@ -43,8 +41,6 @@ public class CsvUtils { - final static Logger log = LoggerFactory.getLogger(CsvUtils.class); - /** * 解析Csv @@ -106,7 +102,7 @@ public static DataUploadEntity parseCsvWithFirstAsHeader(MultipartFile csvFile, reader.close(); } catch (Exception e) { - log.error("CSV file parsing failed: ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } finally { try { @@ -118,7 +114,7 @@ public static DataUploadEntity parseCsvWithFirstAsHeader(MultipartFile csvFile, reader.close(); } } catch (IOException e) { - log.error("CSV file stream close failed: ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } } @@ -195,7 +191,7 @@ public static String formatCsvWithFirstAsHeader(String filePath, String fileName } } catch (Exception e) { - log.error("CSV file writing failed: ", 
e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } finally { try { @@ -204,6 +200,7 @@ public static String formatCsvWithFirstAsHeader(String filePath, String fileName fileWriter.close(); csvPrinter.close(); } catch (Exception e) { + e.printStackTrace(); throw new ServerException(e.getMessage()); } } diff --git a/server/src/main/java/edp/davinci/core/utils/DacChannelUtil.java b/server/src/main/java/edp/davinci/core/utils/DacChannelUtil.java index 32cb252c7..861fac544 100644 --- a/server/src/main/java/edp/davinci/core/utils/DacChannelUtil.java +++ b/server/src/main/java/edp/davinci/core/utils/DacChannelUtil.java @@ -131,7 +131,6 @@ public List getBizs(String dacName, String tenantId) throws NotFoundException { } - @SuppressWarnings("unchecked") public List getData(String dacName, String bizId, String email) { if (dacMap.containsKey(dacName) && !StringUtils.isEmpty(email)) { DacChannel channel = dacMap.get(dacName); diff --git a/server/src/main/java/edp/davinci/core/utils/ExcelUtils.java b/server/src/main/java/edp/davinci/core/utils/ExcelUtils.java index affea80ce..4a3a351bc 100644 --- a/server/src/main/java/edp/davinci/core/utils/ExcelUtils.java +++ b/server/src/main/java/edp/davinci/core/utils/ExcelUtils.java @@ -38,8 +38,6 @@ import org.apache.poi.ss.util.CellRangeAddress; import org.apache.poi.xssf.streaming.SXSSFWorkbook; import org.apache.poi.xssf.usermodel.XSSFWorkbook; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; import org.springframework.web.multipart.MultipartFile; import javax.script.Invocable; @@ -52,13 +50,11 @@ import java.util.stream.IntStream; import static edp.core.consts.Consts.*; -import static edp.davinci.common.utils.ScriptUtils.formatHeader; -import static edp.davinci.common.utils.ScriptUtils.getCellValueScriptEngine; +import static edp.davinci.common.utils.ScriptUtiils.formatHeader; +import static edp.davinci.common.utils.ScriptUtiils.getCellValueScriptEngine; public class ExcelUtils { - final static Logger 
log = LoggerFactory.getLogger(ExcelUtils.class); - /** * 解析上传Excel @@ -101,6 +97,7 @@ public static DataUploadEntity parseExcelWithFirstAsHeader(MultipartFile excelFi headers.add(new QueryColumn(headerRow.getCell(i).getStringCellValue(), SqlUtils.formatSqlType(typeRow.getCell(i).getStringCellValue()))); } catch (Exception e) { + e.printStackTrace(); if (e instanceof NullPointerException) { throw new ServerException("Unknown Type"); } @@ -124,6 +121,7 @@ public static DataUploadEntity parseExcelWithFirstAsHeader(MultipartFile excelFi dataUploadEntity.setValues(values); } catch (ServerException e) { + e.printStackTrace(); throw new ServerException(e.getMessage()); } @@ -144,6 +142,7 @@ private static Workbook getReadWorkbook(MultipartFile excelFile) throws ServerEx throw new ServerException("Invalid excel file"); } } catch (IOException e) { + e.printStackTrace(); throw new ServerException(e.getMessage()); } finally { try { @@ -151,6 +150,7 @@ private static Workbook getReadWorkbook(MultipartFile excelFile) throws ServerEx inputStream.close(); } } catch (IOException e) { + e.printStackTrace(); throw new ServerException(e.getMessage()); } } @@ -206,7 +206,7 @@ public static void writeSheet(Sheet sheet, engine = getCellValueScriptEngine(); excelHeaders = formatHeader(engine, json, params); } catch (Exception e) { - log.error("Failed to write excel sheet: ", e); + e.printStackTrace(); } } @@ -547,9 +547,9 @@ private static List> formatValue(ScriptEngine engine, List cutImage(String basePath, String scrImagePath, int cu ImageIO.write(cropImage, format.substring(1, format.length()), cropFile); log.info("image_{}", n); } catch (Exception e) { - log.info("crop image error: ", e); + e.printStackTrace(); + log.info("crop image error"); executorService.shutdownNow(); } finally { countDownLatch.countDown(); diff --git a/server/src/main/java/edp/davinci/core/utils/SqlParseUtils.java b/server/src/main/java/edp/davinci/core/utils/SqlParseUtils.java index 6be0faae8..5eef7d162 100644 
--- a/server/src/main/java/edp/davinci/core/utils/SqlParseUtils.java +++ b/server/src/main/java/edp/davinci/core/utils/SqlParseUtils.java @@ -20,10 +20,7 @@ package edp.davinci.core.utils; import com.sun.tools.javac.util.ListBuffer; -import org.apache.linkis.entrance.interceptor.impl.CustomVariableUtils; -import org.apache.linkis.governance.common.entity.job.JobRequest; -import org.apache.linkis.manager.label.entity.Label; -import org.apache.linkis.manager.label.entity.engine.CodeLanguageLabel; +import com.webank.wedatasphere.linkis.entrance.interceptor.impl.CustomVariableUtils; import edp.core.consts.Consts; import edp.core.exception.ServerException; import edp.core.utils.CollectionUtils; @@ -134,7 +131,7 @@ public SqlEntity parseSql(String sqlStr, List variables, String sql } } catch (InterruptedException e) { - log.error("Thread interrupt for parsing SQL: ", e); + e.printStackTrace(); } finally { executorService.shutdown(); } @@ -199,38 +196,29 @@ public static String replaceParams(String sql, Map queryParamMap } } +// ST st = new ST(sql, delimiter, delimiter); +// if (!CollectionUtils.isEmpty(authParamMap) && !CollectionUtils.isEmpty(expSet)) { +// authParamMap.forEach((k, v) -> st.add(k, true)); +// } +// //替换query@var +// if (!CollectionUtils.isEmpty(queryParamMap)) { +// queryParamMap.forEach(st::add); +// } +// sql = st.render(); + //Linkis compatible Pattern queryP = Pattern.compile(REG_QUERYVAR); Matcher matcherQuery = queryP.matcher(sql); while (matcherQuery.find()) { String group = matcherQuery.group(); String key = StringUtils.substringBetween(group, "${", "}"); - if(queryParamMap.get(key) != null){ - sql = StringUtils.replace(sql, group, queryParamMap.getOrDefault(key, "").toString()); - } + sql = StringUtils.replace(sql, group, queryParamMap.getOrDefault(key, "").toString()); } - //linkis variable - sql = linkisVariabelReplace(sql, user.username); log.info("after variable substitution sql is {} ", sql); return sql; } - // Apache Linkis variable 
CustomVariableUtils - private static String linkisVariabelReplace(String sql, String username) { - JobRequest jobRequest = new JobRequest(); - - jobRequest.setExecutionCode(sql); - jobRequest.setExecuteUser(username); - CodeLanguageLabel codeLabel = new CodeLanguageLabel(); - codeLabel.setCodeType("sql"); - Map configMap = new HashMap<>(); - jobRequest.setParams(configMap); - - jobRequest.setLabels(Arrays.asList(codeLabel)); - return CustomVariableUtils.replaceCustomVar(jobRequest, "sql")._2; - } - public List getSqls(String sql, boolean isQuery) { sql = sql.trim(); @@ -282,7 +270,7 @@ private static Map getParsedExpression(Set expSet, Map getProjectsByKewordsWithUser(@Param("keywords") String keywords, @Param("userId") Long userId, @Param("orgList") List list); - @Select("select user_id from visualis_project p where id = #{projectId}") - Integer getProjectUserId(@Param("projectId") Long projectId); - @Select({"select id from visualis_project where org_id = #{orgId} and `name` = #{name}"}) + @Select({"select id from dss_project where org_id = #{orgId} and `name` = #{name}"}) Long getByNameWithOrgId(@Param("name") String name, @Param("orgId") Long orgId); int insert(Project project); - @Select({"select * from visualis_project where id = #{id}"}) + @Select({"select * from dss_project where id = #{id}"}) Project getById(@Param("id") Long id); ProjectDetail getProjectDetail(@Param("id") Long id); - @Select({"select * from visualis_project where id = #{id} and user_id = #{userId}"}) + @Select({"select * from dss_project where id = #{id} and user_id = #{userId}"}) Project getByProject(Project project); - @Update({"update visualis_project set description = #{description}, visibility = #{visibility}, update_time = #{updateTime} where id = #{id}"}) + @Update({"update dss_project set `name` = #{name}, description = #{description}, visibility = #{visibility}, update_time = #{updateTime}, update_by = #{updateBy} where id = #{id}"}) int updateBaseInfo(Project project); - 
@Update({"update visualis_project set `org_id` = #{orgId} where id = #{id}"}) + @Update({"update dss_project set `org_id` = #{orgId} where id = #{id}"}) int changeOrganization(Project project); - @Update({"update visualis_project set `is_transfer` = #{isTransfer, jdbcType=TINYINT} where id = #{id}"}) + @Update({"update dss_project set `is_transfer` = #{isTransfer, jdbcType=TINYINT} where id = #{id}"}) int changeTransferStatus(@Param("isTransfer") Boolean isTransfer, @Param("id") Long id); - @Delete({"delete from visualis_project where id = #{id}"}) + @Delete({"delete from dss_project where id = #{id}"}) int deleteById(@Param("id") Long id); - @Update({"update visualis_project set isArchive = 1 where id = #{id}"}) - int setProjectToArchive(@Param("id") Long id); - - @Select({"select * from visualis_project where org_id = #{orgId}"}) + @Select({"select * from dss_project where org_id = #{orgId}"}) List getByOrgId(@Param("orgId") Long orgId); - @Select({"SELECT p.* FROM visualis_project p INNER JOIN display d on p.id = d.project_id where d.id = #{displayId}"}) + @Select({"SELECT p.* FROM dss_project p INNER JOIN display d on p.id = d.project_id where d.id = #{displayId}"}) Project getByDisplayId(@Param("displayId") Long displayId); - @Update({"update visualis_project set star_num = star_num + 1 where id = #{id}"}) + @Update({"update dss_project set star_num = star_num + 1 where id = #{id}"}) int starNumAdd(@Param("id") Long id); - @Update({"update visualis_project set star_num = IF(star_num > 0,star_num - 1, 0) where id = #{id}"}) + @Update({"update dss_project set star_num = IF(star_num > 0,star_num - 1, 0) where id = #{id}"}) int starNumReduce(@Param("id") Long id); Set getProjectIdsByAdmin(@Param("userId") Long userId); @@ -95,25 +90,8 @@ public interface ProjectMapper { int deleteBeforOrgRole(@Param("projectId") Long projectId, @Param("orgId") Long orgId); @Select({ - "select * from visualis_project p", + "select * from dss_project p", "WHERE p.user_id= #{userId} 
AND p.name = #{name}" }) - List getProjectByNameWithUserId(@Param("name") String name, @Param("userId") Long userId); - - @Select("select `project_id` from widget where `id` = #{widgetId}") - Long getProjectIdByWidgetId(@Param("widgetId") Long widgetId); - - - @Select("select `project_id` from display where id = #{displayId}") - Long getProjectByDisplayId(@Param("displayId") Long displayId); - - - @Select("select project_id from dashboard_portal where id = #{dashboardId}") - Long getProjectIdByDashboardId(@Param("dashboardId") Long dashboardId); - - @Select("select project_id from view where id = #{viewId}") - Long getProjectIdByViewId(@Param("viewId") Long viewId); - - @Select("select * from visualis_project where `name` = #{keywords} limit 1") - Project getProjectByName(@Param("keywords") String keywords); + List getProjectByNameWithUserId(@Param("name")String name,@Param("userId")Long userId); } \ No newline at end of file diff --git a/server/src/main/java/edp/davinci/dao/RelUserOrganizationMapper.java b/server/src/main/java/edp/davinci/dao/RelUserOrganizationMapper.java index a426d7e08..621f131bd 100644 --- a/server/src/main/java/edp/davinci/dao/RelUserOrganizationMapper.java +++ b/server/src/main/java/edp/davinci/dao/RelUserOrganizationMapper.java @@ -47,7 +47,7 @@ public interface RelUserOrganizationMapper { "SELECT ruo.id, u.id AS 'user.id', ", " IF(u.`name` is NULL,u.username,u.`name`) AS 'user.username', ", " u.email, u.avatar AS 'user.avatar', ruo.role AS 'user.role'", - "FROM `visualis_user` u", + "FROM `linkis_user` u", "LEFT JOIN rel_user_organization ruo on ruo.user_id = u.id", "LEFT JOIN organization o on o.id = ruo.org_id", "WHERE ruo.org_id = #{orgId}" diff --git a/server/src/main/java/edp/davinci/dao/StarMapper.java b/server/src/main/java/edp/davinci/dao/StarMapper.java index 5f3e4983f..271f9cb4e 100644 --- a/server/src/main/java/edp/davinci/dao/StarMapper.java +++ b/server/src/main/java/edp/davinci/dao/StarMapper.java @@ -55,14 +55,14 @@ public 
interface StarMapper { @Select({ - "select p.*, u.id as 'createBy.id', IF(u.`name` is NULL,u.username,u.`name`) as 'createBy.username' from visualis_project p left join dss_user u on u.id = p.user_id ", + "select p.*, u.id as 'createBy.id', IF(u.`name` is NULL,u.username,u.`name`) as 'createBy.username', u.avatar as 'createBy.avatar' from dss_project p left join linkis_user u on u.id = p.user_id ", "where p.id in (select target_id from star where target = #{target} and user_id = #{userId})" }) List getStarProjectListByUser(@Param("userId") Long userId, @Param("target") String target); @Select({ - "select u.id, IF(u.`name` is NULL,u.username,u.`name`) as username, u.email, u.avatar, s.star_time from star s left join visualis_user u on u.id = s.user_id", + "select u.id, IF(u.`name` is NULL,u.username,u.`name`) as username, u.email, u.avatar, s.star_time from star s left join linkis_user u on u.id = s.user_id", "where s.target = #{target} and s.target_id = #{targetId}" }) List getStarUserListByTarget(@Param("targetId") Long targetId, @Param("target") String target); diff --git a/server/src/main/java/edp/davinci/dao/UserMapper.java b/server/src/main/java/edp/davinci/dao/UserMapper.java index 10de6f3fb..bb158e4f8 100644 --- a/server/src/main/java/edp/davinci/dao/UserMapper.java +++ b/server/src/main/java/edp/davinci/dao/UserMapper.java @@ -34,10 +34,10 @@ public interface UserMapper { int insert(User user); - @Select({"select * from `visualis_user` where id = #{id}"}) + @Select({"select * from `linkis_user` where id = #{id}"}) User getById(@Param("id") Long id); - @Select({"select * from `visualis_user` where `username` = #{username} or `email` = #{username} or `name` = #{username}"}) + @Select({"select * from `linkis_user` where `username` = #{username} or `email` = #{username} or `name` = #{username}"}) User selectByUsername(@Param("username") String username); @Select({"select * from `user` where `email` = #{email}"}) @@ -47,29 +47,29 @@ public interface UserMapper { 
List getUsersByKeyword(@Param("keyword") String keyword, @Param("orgId") Long orgId); - @Update({"update `visualis_user` set `name` = #{name}, description = #{description}, department = #{department}, update_time = #{updateTime}", + @Update({"update `linkis_user` set `name` = #{name}, description = #{description}, department = #{department}, update_time = #{updateTime}", "where id = #{id}"}) int updateBaseInfo(User user); - @Update({"update visualis_user set `avatar` = #{avatar}, update_time = #{updateTime} where id = #{id}"}) + @Update({"update linkis_user set `avatar` = #{avatar}, update_time = #{updateTime} where id = #{id}"}) int updateAvatar(User user); - @Select({"select id from visualis_user where (LOWER(`username`) = LOWER(#{name}) or LOWER(`email`) = LOWER(#{name}) or LOWER(`name`) = LOWER(#{name}))"}) + @Select({"select id from linkis_user where (LOWER(`username`) = LOWER(#{name}) or LOWER(`email`) = LOWER(#{name}) or LOWER(`name`) = LOWER(#{name}))"}) Long getIdByName(@Param("name") String name); - @Update({"update `visualis_user` set `active` = #{active}, `update_time` = #{updateTime} where id = #{id}"}) + @Update({"update `linkis_user` set `active` = #{active}, `update_time` = #{updateTime} where id = #{id}"}) int activeUser(User user); - @Update({"update `visualis_user` set `password` = #{password}, `update_time` = #{updateTime} where id = #{id}"}) + @Update({"update `linkis_user` set `password` = #{password}, `update_time` = #{updateTime} where id = #{id}"}) int changePassword(User user); List getByIds(@Param("userIds") List userIds); - @Select({"select count(id) from `visualis_user` where `email` = #{email}"}) + @Select({"select count(id) from `linkis_user` where `email` = #{email}"}) boolean existEmail(@Param("email") String email); - @Select({"select count(id) from `visualis_user` where `username` = #{username}"}) + @Select({"select count(id) from `linkis_user` where `username` = #{username}"}) boolean existUsername(@Param("username") String 
username); } \ No newline at end of file diff --git a/server/src/main/java/edp/davinci/dao/ViewMapper.java b/server/src/main/java/edp/davinci/dao/ViewMapper.java index 2f696bc16..52d0dacec 100644 --- a/server/src/main/java/edp/davinci/dao/ViewMapper.java +++ b/server/src/main/java/edp/davinci/dao/ViewMapper.java @@ -24,7 +24,10 @@ import edp.davinci.dto.viewDto.ViewWithSource; import edp.davinci.dto.viewDto.ViewWithSourceBaseInfo; import edp.davinci.model.View; -import org.apache.ibatis.annotations.*; +import org.apache.ibatis.annotations.Delete; +import org.apache.ibatis.annotations.Param; +import org.apache.ibatis.annotations.Select; +import org.apache.ibatis.annotations.Update; import org.springframework.stereotype.Component; import java.util.List; @@ -32,7 +35,7 @@ @Component public interface ViewMapper { - @Options(useGeneratedKeys = true, keyProperty = "id") + int insert(View view); @Select({"select id from `view` where project_id = #{projectId} and `name` = #{name}"}) @@ -48,8 +51,21 @@ public interface ViewMapper { @Select({"select * from `view` where id = #{id}"}) View getById(Long id); - List getByIds(@Param("list") Set ids); + @Update({ + "update `view`", + "set `name` = #{name,jdbcType=VARCHAR},", + "`description` = #{description,jdbcType=VARCHAR},", + "`project_id` = #{projectId,jdbcType=BIGINT},", + "`source_id` = #{sourceId,jdbcType=BIGINT},", + "`sql` = #{sql,jdbcType=LONGVARCHAR},", + "`model` = #{model,jdbcType=LONGVARCHAR},", + "`variable` = #{variable,jdbcType=LONGVARCHAR},", + "`config` = #{config,jdbcType=LONGVARCHAR},", + "`update_by` = #{updateBy,jdbcType=BIGINT},", + "`update_time` = #{updateTime,jdbcType=TIMESTAMP}", + "where id = #{id,jdbcType=BIGINT}" + }) int update(View view); @Select({"select * from `view` where source_id = #{sourceId}"}) @@ -96,6 +112,4 @@ public interface ViewMapper { Set selectByWidgetIds(@Param("widgetIds") Set widgetIds); - @Select({"select * from `view` where project_id = #{projectId}"}) - List 
getByProject(@Param("projectId") Long projectId); } \ No newline at end of file diff --git a/server/src/main/java/edp/davinci/dto/projectDto/ProjectUpdate.java b/server/src/main/java/edp/davinci/dto/projectDto/ProjectUpdate.java index 1d40f362c..322438dd9 100644 --- a/server/src/main/java/edp/davinci/dto/projectDto/ProjectUpdate.java +++ b/server/src/main/java/edp/davinci/dto/projectDto/ProjectUpdate.java @@ -28,9 +28,8 @@ @NotNull(message = "project cannot be null") public class ProjectUpdate { - // 项目不允许更新时间 -// @NotBlank(message = "project name cannot be EMPTY") -// private String name; + @NotBlank(message = "project name cannot be EMPTY") + private String name; private String description; diff --git a/server/src/main/java/edp/davinci/dto/userDto/UserBaseInfo.java b/server/src/main/java/edp/davinci/dto/userDto/UserBaseInfo.java index 48f51e12b..3531c00c8 100644 --- a/server/src/main/java/edp/davinci/dto/userDto/UserBaseInfo.java +++ b/server/src/main/java/edp/davinci/dto/userDto/UserBaseInfo.java @@ -27,8 +27,7 @@ public class UserBaseInfo { String username; - // 用户头像路径 - String avatar = null; + String avatar; String email; } diff --git a/server/src/main/java/edp/davinci/dto/viewDto/DistinctParam.java b/server/src/main/java/edp/davinci/dto/viewDto/DistinctParam.java index c11146b14..c04ccffce 100644 --- a/server/src/main/java/edp/davinci/dto/viewDto/DistinctParam.java +++ b/server/src/main/java/edp/davinci/dto/viewDto/DistinctParam.java @@ -19,7 +19,6 @@ package edp.davinci.dto.viewDto; -import com.webank.wedatasphere.dss.visualis.query.model.VirtualView; import lombok.Data; import javax.validation.constraints.NotEmpty; @@ -34,8 +33,6 @@ public class DistinctParam { private List filters; - private VirtualView view; - private List params; private Boolean cache; diff --git a/server/src/main/java/edp/davinci/dto/viewDto/ViewCreate.java b/server/src/main/java/edp/davinci/dto/viewDto/ViewCreate.java index e0017f1a5..a01833eb3 100644 --- 
a/server/src/main/java/edp/davinci/dto/viewDto/ViewCreate.java +++ b/server/src/main/java/edp/davinci/dto/viewDto/ViewCreate.java @@ -39,7 +39,7 @@ public class ViewCreate { @Min(value = 1, message = "Invalid project Id") private Long projectId; - @Min(value = 0, message = "Invalid source Id") + @Min(value = 1, message = "Invalid source Id") private Long sourceId; private String sql; @@ -52,10 +52,9 @@ public class ViewCreate { private List roles; - public ViewCreate() { - } + public ViewCreate(){} - public ViewCreate(String name, String description, Long projectId, Long sourceId, String sql, String model, String config) { + public ViewCreate(String name,String description,Long projectId,Long sourceId,String sql,String model,String config){ this.name = name; this.description = description; this.projectId = projectId; diff --git a/server/src/main/java/edp/davinci/dto/viewDto/ViewExecuteParam.java b/server/src/main/java/edp/davinci/dto/viewDto/ViewExecuteParam.java index a562556e8..dd036dd4c 100644 --- a/server/src/main/java/edp/davinci/dto/viewDto/ViewExecuteParam.java +++ b/server/src/main/java/edp/davinci/dto/viewDto/ViewExecuteParam.java @@ -20,7 +20,6 @@ package edp.davinci.dto.viewDto; import com.alibaba.druid.util.StringUtils; -import com.webank.wedatasphere.dss.visualis.query.model.VirtualView; import edp.core.utils.CollectionUtils; import edp.core.utils.SqlUtils; import lombok.Data; @@ -40,7 +39,6 @@ public class ViewExecuteParam { private List aggregators; private List orders; private List filters; - private VirtualView view; private List params; private Boolean cache; private Long expired; @@ -52,32 +50,9 @@ public class ViewExecuteParam { private boolean nativeQuery = false; - private String chartType = ""; - private String engineType = ""; - public ViewExecuteParam() { } - public ViewExecuteParam(List groupList, - List aggregators, - List orders, - List filterList, - VirtualView view, - List params, - Boolean cache, - Long expired, - Boolean nativeQuery) 
{ - this.groups = groupList; - this.aggregators = aggregators; - this.orders = orders; - this.filters = filterList; - this.view = view; - this.params = params; - this.cache = cache; - this.expired = expired; - this.nativeQuery = nativeQuery; - } - public ViewExecuteParam(List groupList, List aggregators, List orders, @@ -130,17 +105,17 @@ public List getOrders(String jdbcUrl, String dbVersion) { String column = order.getColumn().trim(); Matcher matcher = PATTERN_SQL_AGGREGATE.matcher(order.getColumn().trim().toLowerCase()); if (!matcher.find()) { -// String prefix = SqlUtils.getKeywordPrefix(jdbcUrl, dbVersion); -// String suffix = SqlUtils.getKeywordSuffix(jdbcUrl, dbVersion); -// StringBuilder columnBuilder = new StringBuilder(); -// if (!column.startsWith(prefix)) { -// columnBuilder.append(prefix); -// } -// columnBuilder.append(column); -// if (!column.endsWith(suffix)) { -// columnBuilder.append(suffix); -// } - order.setColumn(column); + String prefix = SqlUtils.getKeywordPrefix(jdbcUrl, dbVersion); + String suffix = SqlUtils.getKeywordSuffix(jdbcUrl, dbVersion); + StringBuilder columnBuilder = new StringBuilder(); + if (!column.startsWith(prefix)) { + columnBuilder.append(prefix); + } + columnBuilder.append(column); + if (!column.endsWith(suffix)) { + columnBuilder.append(suffix); + } + order.setColumn(columnBuilder.toString()); } list.add(order); } diff --git a/server/src/main/java/edp/davinci/dto/viewDto/ViewWithProject.java b/server/src/main/java/edp/davinci/dto/viewDto/ViewWithProject.java deleted file mode 100644 index ed11b9897..000000000 --- a/server/src/main/java/edp/davinci/dto/viewDto/ViewWithProject.java +++ /dev/null @@ -1,10 +0,0 @@ -package edp.davinci.dto.viewDto; - -import edp.davinci.model.Project; -import edp.davinci.model.View; -import lombok.Data; - -@Data -public class ViewWithProject extends View { - Project project; -} diff --git a/server/src/main/java/edp/davinci/dto/viewDto/ViewWithSourceBaseInfo.java 
b/server/src/main/java/edp/davinci/dto/viewDto/ViewWithSourceBaseInfo.java index f31b5f073..07ebc0c64 100644 --- a/server/src/main/java/edp/davinci/dto/viewDto/ViewWithSourceBaseInfo.java +++ b/server/src/main/java/edp/davinci/dto/viewDto/ViewWithSourceBaseInfo.java @@ -31,110 +31,4 @@ public class ViewWithSourceBaseInfo extends View { private SourceBaseInfo source; private List roles; - - private Long id; - - private String name; - - private String description; - - private Long projectId; - - private Long sourceId; - - private String sql; - - private String model; - - private String variable; - - private String config; - - public SourceBaseInfo getSource() { - return source; - } - - public void setSource(SourceBaseInfo source) { - this.source = source; - } - - public List getRoles() { - return roles; - } - - public void setRoles(List roles) { - this.roles = roles; - } - - public Long getId() { - return id; - } - - public void setId(Long id) { - this.id = id; - } - - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public String getDescription() { - return description; - } - - public void setDescription(String description) { - this.description = description; - } - - public Long getProjectId() { - return projectId; - } - - public void setProjectId(Long projectId) { - this.projectId = projectId; - } - - public Long getSourceId() { - return sourceId; - } - - public void setSourceId(Long sourceId) { - this.sourceId = sourceId; - } - - public String getSql() { - return sql; - } - - public void setSql(String sql) { - this.sql = sql; - } - - public String getModel() { - return model; - } - - public void setModel(String model) { - this.model = model; - } - - public String getVariable() { - return variable; - } - - public void setVariable(String variable) { - this.variable = variable; - } - - public String getConfig() { - return config; - } - - public void setConfig(String config) { - this.config = config; - } } 
diff --git a/server/src/main/java/edp/davinci/model/Config.java b/server/src/main/java/edp/davinci/model/Config.java deleted file mode 100644 index 28297869f..000000000 --- a/server/src/main/java/edp/davinci/model/Config.java +++ /dev/null @@ -1,15 +0,0 @@ -package edp.davinci.model; - -import com.alibaba.fastjson.annotation.JSONField; -import lombok.Data; - -@Data -public class Config { - private Long id; - private String key; - private String value; - private String scope; - private String username; - @JSONField(serialize = false) - private String params; -} diff --git a/server/src/main/java/edp/davinci/service/DisplayService.java b/server/src/main/java/edp/davinci/service/DisplayService.java index d84a724ee..8f8e7d77d 100644 --- a/server/src/main/java/edp/davinci/service/DisplayService.java +++ b/server/src/main/java/edp/davinci/service/DisplayService.java @@ -39,8 +39,11 @@ public interface DisplayService extends CheckEntityService { SlideWithMem getDisplaySlideMem(Long displayId, Long slideId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException; Display createDisplay(DisplayInfo displayInfo, User user) throws NotFoundException, UnAuthorizedExecption, ServerException; - +//======= Display copyDisplay(DisplayInfo displayInfo, User user) throws NotFoundException, UnAuthorizedExecption, ServerException; +// +// ResultMap createDisplay(DisplayInfo displayInfo, User user, HttpServletRequest request); +//>>>>>>> drawis boolean updateDisplay(DisplayUpdate displayUpdate, User user) throws NotFoundException, UnAuthorizedExecption, ServerException; diff --git a/server/src/main/java/edp/davinci/service/ProjectService.java b/server/src/main/java/edp/davinci/service/ProjectService.java index 5a0adcfd1..7f595ab5c 100644 --- a/server/src/main/java/edp/davinci/service/ProjectService.java +++ b/server/src/main/java/edp/davinci/service/ProjectService.java @@ -43,8 +43,6 @@ public interface ProjectService extends CheckEntityService { Project 
 updateProject(Long id, ProjectUpdate projectUpdate, User user) throws ServerException, UnAuthorizedExecption, NotFoundException;
 
-    boolean setProjectToArchive(Long id, User user) throws ServerException, UnAuthorizedExecption, NotFoundException;
-
     boolean deleteProject(Long id, User user) throws ServerException, UnAuthorizedExecption, NotFoundException;
 
     Project transferPeoject(Long id, Long orgId, User user) throws ServerException, UnAuthorizedExecption, NotFoundException;
@@ -74,6 +72,4 @@ public interface ProjectService extends CheckEntityService {
     List getAdmins(Long id, User user) throws NotFoundException, UnAuthorizedExecption;
 
     boolean isMaintainer(ProjectDetail projectDetail, User user);
-
-    Project checkProjectName(String keywords);
 }
diff --git a/server/src/main/java/edp/davinci/service/ShareService.java b/server/src/main/java/edp/davinci/service/ShareService.java
index e02daf283..48a565de2 100644
--- a/server/src/main/java/edp/davinci/service/ShareService.java
+++ b/server/src/main/java/edp/davinci/service/ShareService.java
@@ -49,7 +49,7 @@ public interface ShareService {
     ShareDashboard getShareDashboard(String token, User user) throws NotFoundException, ServerException, ForbiddenExecption, UnAuthorizedExecption;
 
-    Paginate> getShareData(String token, ViewExecuteParam executeParam, User user, HttpServletRequest request) throws NotFoundException, ServerException, ForbiddenExecption, UnAuthorizedExecption, SQLException;
+    Paginate> getShareData(String token, ViewExecuteParam executeParam, User user) throws NotFoundException, ServerException, ForbiddenExecption, UnAuthorizedExecption, SQLException;
 
     String generationShareDataCsv(ViewExecuteParam executeParam, User user, String token) throws NotFoundException, ServerException, ForbiddenExecption, UnAuthorizedExecption;
diff --git a/server/src/main/java/edp/davinci/service/SourceService.java b/server/src/main/java/edp/davinci/service/SourceService.java
index e285d414b..8558f835b 100644
--- a/server/src/main/java/edp/davinci/service/SourceService.java
+++ b/server/src/main/java/edp/davinci/service/SourceService.java
@@ -61,6 +61,4 @@ public interface SourceService extends CheckEntityService {
     List getDatasources();
 
     boolean reconnect(Long id, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
-
-    List getAvailableEngineTypes(String username);
 }
diff --git a/server/src/main/java/edp/davinci/service/ViewService.java b/server/src/main/java/edp/davinci/service/ViewService.java
index 9ceed4e02..49c10a6d1 100644
--- a/server/src/main/java/edp/davinci/service/ViewService.java
+++ b/server/src/main/java/edp/davinci/service/ViewService.java
@@ -19,14 +19,12 @@
 
 package edp.davinci.service;
 
-import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus;
 import edp.core.exception.NotFoundException;
 import edp.core.exception.ServerException;
 import edp.core.exception.UnAuthorizedExecption;
 import edp.core.model.Paginate;
 import edp.core.model.PaginateWithQueryColumns;
 import edp.davinci.core.service.CheckEntityService;
-import edp.davinci.dto.sourceDto.SourceWithProject;
 import edp.davinci.dto.viewDto.*;
 import edp.davinci.model.User;
 import edp.davinci.service.excel.SQLContext;
@@ -39,34 +37,32 @@ public interface ViewService extends CheckEntityService {
 
     List getViews(Long projectId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
 
-    ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user, String ticketId) throws NotFoundException, UnAuthorizedExecption, ServerException;
+    ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
 
     boolean updateView(ViewUpdate viewUpdate, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
 
+    boolean deleteView(Long id, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
+//=======
+//    ResultMap updateViewColumns(ViewUpdate viewUpdate, User user, HttpServletRequest request);
+//
+//    ResultMap deleteView(Long id, User user, HttpServletRequest request);
+//>>>>>>> drawis
 
     PaginateWithQueryColumns executeSql(ViewExecuteSql executeSql, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
 
-    // Execute the statement asynchronously
-    PaginateWithExecStatus AsyncSubmitSql(ViewExecuteSql executeSql, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
-
-    Paginate> getData(Long id, ViewExecuteParam executeParam, User user, boolean async) throws NotFoundException, UnAuthorizedExecption, ServerException, SQLException;
-
-    Paginate> getAsyncProgress(String execId, User user) throws Exception;
+    Paginate> getData(Long id, ViewExecuteParam executeParam, User user) throws NotFoundException, UnAuthorizedExecption, ServerException, SQLException;
 
-    Paginate> killAsyncJob(String execId, User user) throws Exception;
-
-    Paginate> getAsyncResult(String execId, User user) throws Exception;
-
-    PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWithSource viewWithSource, ViewExecuteParam executeParam, User user, boolean async) throws ServerException, SQLException;
+    PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWithSource viewWithSource, ViewExecuteParam executeParam, User user) throws ServerException, SQLException;
 
     List> getDistinctValue(Long id, DistinctParam param, User user) throws NotFoundException, ServerException, UnAuthorizedExecption;
+//=======
+//    List> getResultDataList(ViewWithProjectAndSource viewWithProjectAndSource, ViewExecuteParam executeParam, User user,String sharedUser) throws ServerException;
+//>>>>>>> drawis
 
     List getDistinctValueData(boolean isMaintainer, ViewWithSource viewWithSource, DistinctParam param, User user) throws ServerException;
 
     ViewWithSourceBaseInfo getView(Long id, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
 
     SQLContext getSQLContext(boolean isMaintainer, ViewWithSource viewWithSource, ViewExecuteParam executeParam, User user);
-
-    SourceWithProject getDefaultSourceWithProject(Long sourceId, User user);
 }
diff --git a/server/src/main/java/edp/davinci/service/elastic/ElasticConfigration.java b/server/src/main/java/edp/davinci/service/elastic/ElasticConfigration.java
index b55f48f30..01bdd3264 100644
--- a/server/src/main/java/edp/davinci/service/elastic/ElasticConfigration.java
+++ b/server/src/main/java/edp/davinci/service/elastic/ElasticConfigration.java
@@ -25,7 +25,6 @@ public class ElasticConfigration {
     @Autowired
     public Environment environment;
 
-    @SuppressWarnings("unchecked")
     @PostConstruct
     public void initialize() throws Exception {
         String statistic_open = environment.getProperty("statistic.enable");
diff --git a/server/src/main/java/edp/davinci/service/excel/AbstractSheetWriter.java b/server/src/main/java/edp/davinci/service/excel/AbstractSheetWriter.java
index 2d37a33e6..26d45e8ba 100644
--- a/server/src/main/java/edp/davinci/service/excel/AbstractSheetWriter.java
+++ b/server/src/main/java/edp/davinci/service/excel/AbstractSheetWriter.java
@@ -61,11 +61,11 @@ public abstract class AbstractSheetWriter {
     private int nextRowNum = 0;
 
     // Records the data format corresponding to each header
-    Map headerFormatMap = new HashMap<>();
+    Map headerFormatMap = new HashMap();
     // Marks the unit used by each numeric format
-    Map dataUnitMap = new HashMap<>();
+    Map dataUnitMap = new HashMap();
     // Records the maximum character width of each column
-    Map columnWidthMap = new HashMap<>();
+    Map columnWidthMap = new HashMap();
 
     protected void init(SheetContext context) throws Exception {
diff --git a/server/src/main/java/edp/davinci/service/excel/ExecutorUtil.java b/server/src/main/java/edp/davinci/service/excel/ExecutorUtil.java
index 7f1eb6af2..dd4933bb1 100644
--- a/server/src/main/java/edp/davinci/service/excel/ExecutorUtil.java
+++ b/server/src/main/java/edp/davinci/service/excel/ExecutorUtil.java
@@ -41,7 +41,6 @@ public class ExecutorUtil {
             new ThreadFactoryBuilder().setNameFormat("sheet-worker-%d").setDaemon(true).build());
 
-    @SuppressWarnings("unchecked")
     public static Future submitWorkbookTask(WorkbookWorker worker) {
         printThreadPoolStatusLog(WORKBOOK_WORKERS, "WORKBOOK_WORKERS");
         return ExecutorUtil.WORKBOOK_WORKERS.submit(worker);
@@ -51,7 +50,6 @@ public static Future submitWorkbookTask(WorkBookContext context) {
         return ExecutorUtil.submitWorkbookTask(new WorkbookWorker(context));
     }
 
-    @SuppressWarnings("unchecked")
     public static Future submitSheetTask(SheetWorker worker) {
         printThreadPoolStatusLog(SHEET_WORKERS, "SHEET_WORKERS");
         return ExecutorUtil.SHEET_WORKERS.submit(worker);
diff --git a/server/src/main/java/edp/davinci/service/excel/SheetWorker.java b/server/src/main/java/edp/davinci/service/excel/SheetWorker.java
index 75f15f12c..a5ca1c3aa 100644
--- a/server/src/main/java/edp/davinci/service/excel/SheetWorker.java
+++ b/server/src/main/java/edp/davinci/service/excel/SheetWorker.java
@@ -55,7 +55,6 @@ public SheetWorker(SheetContext context) {
         this.context = context;
     }
 
-    @SuppressWarnings("unchecked")
     @Override
     public T call() throws Exception {
         Stopwatch watch = Stopwatch.createStarted();
diff --git a/server/src/main/java/edp/davinci/service/excel/WorkbookWorker.java b/server/src/main/java/edp/davinci/service/excel/WorkbookWorker.java
index 873fdfb01..0c4d22f15 100644
--- a/server/src/main/java/edp/davinci/service/excel/WorkbookWorker.java
+++ b/server/src/main/java/edp/davinci/service/excel/WorkbookWorker.java
@@ -24,7 +24,7 @@
 import edp.core.utils.CollectionUtils;
 import edp.core.utils.FileUtils;
 import edp.core.utils.SqlUtils;
-import edp.davinci.common.utils.ScriptUtils;
+import edp.davinci.common.utils.ScriptUtiils;
 import edp.davinci.core.config.SpringContextHolder;
 import edp.davinci.core.enums.ActionEnum;
 import edp.davinci.core.enums.FileTypeEnum;
@@ -65,7 +65,6 @@ public WorkbookWorker(WorkBookContext context) {
     }
 
-    @SuppressWarnings("unchecked")
     @Override
     public T call() throws Exception {
         Stopwatch watch = Stopwatch.createStarted();
@@ -143,7 +142,7 @@ private List buildSheetContextList() throws Exception {
         if (context.isHasExecuteParam() && null != context.getExecuteParam()) {
             executeParam = context.getExecuteParam();
         } else {
-            executeParam = ScriptUtils.getViewExecuteParam(ScriptUtils.getExecuptParamScriptEngine(),
+            executeParam = ScriptUtiils.getViewExecuteParam(ScriptUtiils.getExecuptParamScriptEngine(),
                     context.getDashboard() != null ? context.getDashboard().getConfig() : null,
                     context.getWidget().getConfig(),
                     context.getMemDashboardWidget() != null ? context.getMemDashboardWidget().getId() : null);
@@ -158,7 +157,7 @@ private List buildSheetContextList() throws Exception {
         boolean isTable;
         List excelHeaders = null;
         if (isTable = ExcelUtils.isTable(context.getWidget().getConfig())) {
-            excelHeaders = ScriptUtils.formatHeader(ScriptUtils.getCellValueScriptEngine(), context.getWidget().getConfig(),
+            excelHeaders = ScriptUtiils.formatHeader(ScriptUtiils.getCellValueScriptEngine(), context.getWidget().getConfig(),
                     sqlContext.getViewExecuteParam().getParams());
         }
         SheetContext sheetContext = SheetContext.newSheetContextBuilder()
diff --git a/server/src/main/java/edp/davinci/service/impl/BuriedPointsServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/BuriedPointsServiceImpl.java
index 41259cc3b..4d2d2b7e8 100644
--- a/server/src/main/java/edp/davinci/service/impl/BuriedPointsServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/BuriedPointsServiceImpl.java
@@ -146,7 +146,7 @@ public static List> EntityConvertMap(List list){
                 l.add(map);
             }
         } catch (Exception e) {
-            log.error("Entity to map error: ", e);
+            e.printStackTrace();
         }
         return l;
     }
diff --git a/server/src/main/java/edp/davinci/service/impl/CronJobServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/CronJobServiceImpl.java
index e4a182d5f..29ce2a0ca 100755
--- a/server/src/main/java/edp/davinci/service/impl/CronJobServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/CronJobServiceImpl.java
@@ -208,7 +208,8 @@ public boolean updateCronJob(CronJobUpdate cronJobUpdate, User user) throws NotF
             quartzUtils.removeJob(cronJob);
             cronJob.setJobStatus(CronJobStatusEnum.FAILED.getStatus());
             cronJobMapper.update(cronJob);
-            log.error("Failed to modify cron job: ", e);
+
+            e.printStackTrace();
         }
 
         return true;
@@ -330,7 +331,7 @@ public CronJob stopCronJob(Long id, User user) throws NotFoundException, UnAutho
         try {
             countDownLatch.await(15L, TimeUnit.SECONDS);
         } catch (InterruptedException e) {
-            log.error("Failed to stop cron job: ", e);
+            e.printStackTrace();
         } finally {
             countDownLatch.countDown();
         }
@@ -347,7 +348,7 @@ public CronJob stopCronJob(Long id, User user) throws NotFoundException, UnAutho
             cronJob.setJobStatus(CronJobStatusEnum.FAILED.getStatus());
             cronJobMapper.update(cronJob);
 
-            log.error("Failed to stop cron job: ", e);
+            e.printStackTrace();
             return cronJob;
         }
     }
diff --git a/server/src/main/java/edp/davinci/service/impl/DashboardPortalServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/DashboardPortalServiceImpl.java
index e9f560135..aa08526ec 100644
--- a/server/src/main/java/edp/davinci/service/impl/DashboardPortalServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/DashboardPortalServiceImpl.java
@@ -23,7 +23,6 @@
 import edp.core.exception.ServerException;
 import edp.core.exception.UnAuthorizedExecption;
 import edp.core.utils.CollectionUtils;
-import edp.davinci.common.utils.ComponentFilterUtils;
 import edp.davinci.core.enums.LogNameEnum;
 import edp.davinci.core.enums.UserPermissionEnum;
 import edp.davinci.core.enums.VizEnum;
@@ -121,9 +120,6 @@ public List getDashboardPortals(Long projectId, User user) thro
         }
 
-        ComponentFilterUtils filter = new ComponentFilterUtils();
-        dashboardPortals = filter.doFilterDashboardPortal(dashboardPortals);
-
         return dashboardPortals;
     }
diff --git a/server/src/main/java/edp/davinci/service/impl/DashboardServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/DashboardServiceImpl.java
index fea2fcbec..a756736cb 100644
--- a/server/src/main/java/edp/davinci/service/impl/DashboardServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/DashboardServiceImpl.java
@@ -21,7 +21,6 @@
 
 import com.alibaba.druid.util.StringUtils;
 import com.alibaba.fastjson.JSON;
-import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth;
 import edp.core.exception.NotFoundException;
 import edp.core.exception.ServerException;
 import edp.core.exception.UnAuthorizedExecption;
@@ -77,9 +76,6 @@ public class DashboardServiceImpl extends VizCommonService implements DashboardS
     @Autowired
     private ShareService shareService;
 
-    @Autowired
-    private ProjectAuth projectAuth;
-
     @Override
     public synchronized boolean isExist(String name, Long id, Long portalId) {
@@ -219,11 +215,6 @@ public Dashboard createDashboard(DashboardCreate dashboardCreate, User user) thr
             throw new NotFoundException("the dashboard portal is not found");
         }
 
-
-        if(!projectAuth.isPorjectOwner(dashboardPortal.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectDetail projectDetail = projectService.getProjectDetail(dashboardPortal.getProjectId(), user, false);
         ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user);
diff --git a/server/src/main/java/edp/davinci/service/impl/DisplayServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/DisplayServiceImpl.java
index b2d906e63..21b97cfd8 100644
--- a/server/src/main/java/edp/davinci/service/impl/DisplayServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/DisplayServiceImpl.java
@@ -21,13 +21,11 @@
 
 import com.alibaba.druid.util.StringUtils;
 import com.alibaba.fastjson.JSONObject;
-import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth;
 import edp.core.exception.NotFoundException;
 import edp.core.exception.ServerException;
 import edp.core.exception.UnAuthorizedExecption;
 import edp.core.utils.CollectionUtils;
 import edp.core.utils.FileUtils;
-import edp.davinci.common.utils.ComponentFilterUtils;
 import edp.davinci.core.common.Constants;
 import edp.davinci.core.enums.LogNameEnum;
 import edp.davinci.core.enums.UserPermissionEnum;
@@ -79,10 +77,6 @@ public class DisplayServiceImpl extends VizCommonService implements DisplayServi
     @Autowired
     private RelRoleDisplaySlideWidgetMapper relRoleDisplaySlideWidgetMapper;
 
-
-    @Autowired
-    private ProjectAuth projectAuth;
-
     @Override
     public synchronized boolean isExist(String name, Long id, Long projectId) {
         Long displayId = displayMapper.getByNameWithProjectId(name, projectId);
@@ -214,10 +208,6 @@ public boolean deleteDisplay(Long id, User user) throws NotFoundException, UnAut
             return true;
         }
 
-        if(!projectAuth.isPorjectOwner(displayWithProject.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectDetail projectDetail = projectService.getProjectDetail(displayWithProject.getProjectId(), user, false);
         ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user);
@@ -274,11 +264,6 @@ public boolean deleteDisplaySlide(Long slideId, User user) throws NotFoundExcept
             throw new NotFoundException("display is not found");
         }
 
-
-        if(!projectAuth.isPorjectOwner(display.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectPermission projectPermission = projectService.getProjectPermission(projectService.getProjectDetail(display.getProjectId(), user, false), user);
 
         List disableDisplays = getDisableVizs(user.getId(), display.getProjectId(), null, VizEnum.DISPLAY);
@@ -326,10 +311,6 @@ public boolean updateDisplay(DisplayUpdate displayUpdate, User user) throws NotF
             throw new NotFoundException("display is not found");
         }
 
-        if(!projectAuth.isPorjectOwner(display.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectDetail projectDetail = projectService.getProjectDetail(display.getProjectId(), user, false);
         ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user);
@@ -399,10 +380,6 @@ public DisplaySlide createDisplaySlide(DisplaySlideCreate displaySlideCreate, Us
             throw new NotFoundException("display is not found");
         }
 
-        if(!projectAuth.isPorjectOwner(display.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectPermission projectPermission = projectService.getProjectPermission(projectService.getProjectDetail(display.getProjectId(), user, false), user);
 
         List disableDisplays = getDisableVizs(user.getId(), display.getProjectId(), null, VizEnum.DISPLAY);
@@ -460,10 +437,6 @@ public boolean updateDisplaySildes(Long displayId, DisplaySlide[] displaySlides,
             throw new NotFoundException("display is not found");
         }
 
-        if(!projectAuth.isPorjectOwner(display.getProjectId(), user.getId())) {
-            throw new UnAuthorizedExecption("current user has no permission.");
-        }
-
         ProjectPermission projectPermission = projectService.getProjectPermission(projectService.getProjectDetail(display.getProjectId(), user, false), user);
 
         List disableDisplays = getDisableVizs(user.getId(), display.getProjectId(), null, VizEnum.DISPLAY);
@@ -819,9 +792,6 @@ public List getDisplayListByProject(Long projectId, User user) throws N
             }
         }
 
-        ComponentFilterUtils filter = new ComponentFilterUtils();
-        displays = filter.doFilterDisplays(displays);
-
         return displays;
     }
@@ -899,7 +869,6 @@ public DisplayWithSlides getDisplaySlideList(Long displayId, User user) throws N
      * @param user
      * @return
      */
-    // display id 467, slid id 462
     @Override
     public SlideWithMem getDisplaySlideMem(Long displayId, Long slideId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException {
@@ -1024,7 +993,7 @@ public String uploadAvatar(MultipartFile file) throws ServerException {
         try {
             avatar = fileUtils.upload(file, Constants.DISPLAY_AVATAR_PATH, fileName);
         } catch (Exception e) {
-            log.error("Failed to upload picture: ", e);
+            e.printStackTrace();
             throw new ServerException("display cover picture upload error");
         }
@@ -1108,7 +1077,7 @@ public String uploadSlideBGImage(Long slideId, MultipartFile file, User user) th
                 jsonObject.put("slideParams", slideParams);
             }
         } catch (Exception e) {
-            log.error("display slide background upload error: ", e);
+            e.printStackTrace();
             throw new ServerException("display slide background upload error");
         }
@@ -1185,7 +1154,7 @@ public String uploadSlideSubWidgetBGImage(Long relationId, MultipartFile file, U
             }
             jsonObject.put(key, background);
         } catch (Exception e) {
-            log.error("display slide sub widget backgroundImage upload error: ", e);
+            e.printStackTrace();
             throw new ServerException("display slide sub widget backgroundImage upload error");
         }
diff --git a/server/src/main/java/edp/davinci/service/impl/DownloadServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/DownloadServiceImpl.java
index 507383d29..5daddf679 100644
--- a/server/src/main/java/edp/davinci/service/impl/DownloadServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/DownloadServiceImpl.java
@@ -92,7 +92,6 @@ public DownloadRecord downloadById(Long id, String token) throws UnAuthorizedExe
         return record;
     }
 
-    @SuppressWarnings("unchecked")
     @Override
     public Boolean submit(DownloadType type, Long id, User user, List params) {
         try {
diff --git a/server/src/main/java/edp/davinci/service/impl/EmailScheduleServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/EmailScheduleServiceImpl.java
index 7fe220dc9..ba0e4132b 100644
--- a/server/src/main/java/edp/davinci/service/impl/EmailScheduleServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/EmailScheduleServiceImpl.java
@@ -64,8 +64,8 @@
 
 import static edp.core.consts.Consts.EMPTY;
 import static edp.core.consts.Consts.SEMICOLON;
-import static edp.davinci.common.utils.ScriptUtils.getExecuptParamScriptEngine;
-import static edp.davinci.common.utils.ScriptUtils.getViewExecuteParam;
+import static edp.davinci.common.utils.ScriptUtiils.getExecuptParamScriptEngine;
+import static edp.davinci.common.utils.ScriptUtiils.getViewExecuteParam;
 
 @Slf4j
 @Service("emailScheduleService")
@@ -211,7 +211,6 @@ private List generateImages(long jobId, CronJobConfig cronJobConfi
     @Override
     public List getPreviewImage(Long userId, String contentType, Long contentId) throws Exception{
         List imageContents = new ArrayList<>();
-        // Only capture a single screenshot
         int order = 0;
         String url = getContentUrl(userId, contentType, contentId);
         imageContents.add(new ImageContent(order, contentId, contentType, url));
@@ -255,7 +254,6 @@ public String getContentUrl(Long userId, String contentType, Long contengId) {
      * @return
      * @throws Exception
      */
-    @SuppressWarnings("unchecked")
     private List generateExcels(CronJobConfig cronJobConfig, User user) throws Exception {
         ScriptEngine engine = getExecuptParamScriptEngine();
@@ -325,7 +323,7 @@ private List generateExcels(CronJobConfig cronJobConfig, User user
                     excelContents.add(new ExcelContent(name, msgMailExcel.getFilePath()));
                     countDownLatch.countDown();
                 } catch (InterruptedException e) {
-                    log.error(e.getMessage());
+                    e.printStackTrace();
                 } finally {
                     lock.unlock();
                 }
diff --git a/server/src/main/java/edp/davinci/service/impl/LdapServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/LdapServiceImpl.java
index 0d373c078..83ff4e4e7 100644
--- a/server/src/main/java/edp/davinci/service/impl/LdapServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/LdapServiceImpl.java
@@ -112,7 +112,7 @@ public LdapPerson findByUsername(String username, String password) {
                 ldapPerson = search.get(0);
             }
         } catch (Exception e) {
-            log.error("LDAP lookup user failed: ", e);
+            e.printStackTrace();
         } finally {
             if (null != ctx) {
                 LdapUtils.closeContext(ctx);
diff --git a/server/src/main/java/edp/davinci/service/impl/OrganizationServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/OrganizationServiceImpl.java
index 13306752c..e09fcd00f 100644
--- a/server/src/main/java/edp/davinci/service/impl/OrganizationServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/OrganizationServiceImpl.java
@@ -191,6 +191,7 @@ public Map uploadAvatar(Long id, MultipartFile file, User user)
             }
         } catch (Exception e) {
             log.error("uploadAvatar: organization({}) avatar upload error, error: {}", organization.getName(), e.getMessage());
+            e.printStackTrace();
             throw new ServerException("organization avatar upload error");
         }
@@ -377,7 +378,8 @@ public void inviteMember(Long orgId, Long memId, User user) throws NotFoundExcep
                     Constants.INVITE_ORG_MEMBER_MAIL_TEMPLATE,
                     content);
         } catch (ServerException e) {
-            log.error(e.getMessage());
+            log.info(e.getMessage());
+            e.printStackTrace();
         }
     }
diff --git a/server/src/main/java/edp/davinci/service/impl/ProjectServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/ProjectServiceImpl.java
index 1d373212f..6867c7037 100644
--- a/server/src/main/java/edp/davinci/service/impl/ProjectServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/ProjectServiceImpl.java
@@ -41,7 +41,6 @@
 import edp.davinci.service.DisplayService;
 import edp.davinci.service.ProjectService;
 import lombok.extern.slf4j.Slf4j;
-import org.apache.commons.lang.StringUtils;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.BeanUtils;
@@ -111,11 +110,11 @@ public boolean isExist(String name, Long id, Long scopeId) {
 
     @Override
     public synchronized boolean isExist(String name, Long id, Long orgId, Long userId) {
-        if (isExist(name, id, orgId)) {
+        if(isExist(name, id, orgId)){
            return true;
         }
         Project project = Iterables.getFirst(projectMapper.getProjectByNameWithUserId(name, userId), null);
-        if (project != null) {
+        if(project != null){
             if (null != id && null != project.getId()) {
                 return !id.equals(project.getId());
             }
@@ -171,13 +170,13 @@ public PageInfo searchProjects(String keywords, User user,
             throw new ServerException("Invalid page info");
         }
 
-//        List orgs = organizationMapper.getOrganizationByUser(user.getId());
-//        if (CollectionUtils.isEmpty(orgs)) {
-//            throw new UnAuthorizedExecption();
-//        }
+        List orgs = organizationMapper.getOrganizationByUser(user.getId());
+        if (CollectionUtils.isEmpty(orgs)) {
+            throw new UnAuthorizedExecption();
+        }
 
         PageHelper.startPage(pageNum, pageSize);
-        List projects = projectMapper.getProjectsByKewordsWithUser(keywords, user.getId(), null);
+        List projects = projectMapper.getProjectsByKewordsWithUser(keywords, user.getId(), orgs);
         PageInfo pageInfo = new PageInfo<>(projects);
         return pageInfo;
     }
@@ -205,7 +204,7 @@ public ProjectInfo createProject(ProjectCreat projectCreat, User user) throws Se
//            throw new NotFoundException("not found organization");
//        }
 
-        if (organization != null) {
+        if(organization != null){
             RelUserOrganization rel = relUserOrganizationMapper.getRel(user.getId(), organization.getId());
             if (rel != null && rel.getRole() != UserOrgRoleEnum.OWNER.getRole() && !organization.getAllowCreateProject()) {
                 log.info("project are not allowed to be created under the organization named {}", organization.getName());
@@ -222,7 +221,7 @@ public ProjectInfo createProject(ProjectCreat projectCreat, User user) throws Se
         if (insert > 0) {
             optLogger.info("project ({}) is create by user(:{})", project.toString(), user.getId());
 
-            if (organization != null) {
+            if(organization != null){
                 organization.setProjectNum(organization.getProjectNum() + 1);
                 organizationMapper.updateProjectNum(organization);
             }
@@ -309,22 +308,6 @@ public Project transferPeoject(Long id, Long orgId, User user) throws ServerExce
         }
     }
 
-    @Override
-    public boolean setProjectToArchive(Long id, User user) throws ServerException, UnAuthorizedExecption, NotFoundException {
-
-        ProjectDetail project = getProjectDetail(id, user, true);
-
-        // Archive the project
-        int i = projectMapper.setProjectToArchive(id);
-        if(i > 0) {
-            log.warn("The project: {} has been set to archive status", id);
-            return true;
-        } else {
-            log.error("delete project: {} fail", id);
-            throw new ServerException("delete project fail: unspecified error");
-        }
-    }
-
     /**
      * Delete a project
     *
@@ -397,12 +380,11 @@ public Project updateProject(Long id, ProjectUpdate projectUpdate, User user) th
 
         String originInfo = project.baseInfoToString();
 
-//        project.setName(projectUpdate.getName());
-//        project.setVisibility(projectUpdate.getVisibility());
-//        project.setUpdateBy(user.getId());
-        // For compatibility with DSS project updates, only the description and update time are modified
+        project.setName(projectUpdate.getName());
         project.setDescription(projectUpdate.getDescription());
+        project.setVisibility(projectUpdate.getVisibility());
         project.setUpdateTime(new Date());
+        project.setUpdateBy(user.getId());
 
         int i = projectMapper.updateBaseInfo(project);
         if (i > 0) {
@@ -571,8 +553,7 @@ public ProjectDetail getProjectDetail(Long id, User user, boolean modify) throws
             throw new NotFoundException("project is not found");
         }
 
-        //updated for dss project sharing
-        boolean isCreater = true;//projectDetail.getUserId().equals(user.getId()) && !projectDetail.getIsTransfer();
+        boolean isCreater = projectDetail.getUserId().equals(user.getId()) && !projectDetail.getIsTransfer();
 
         RelUserOrganization rel = relUserOrganizationMapper.getRel(user.getId(), projectDetail.getOrgId());
         RelProjectAdmin relProjectAdmin = relProjectAdminMapper.getByProjectAndUser(id, user.getId());
@@ -775,48 +756,37 @@ public List getAdmins(Long id, User user) throws NotFoundExc
      * @return
      */
     public boolean isMaintainer(ProjectDetail projectDetail, User user) {
-        return true;
-        //updated for dss project sharing
-//        if (null == projectDetail || null == user) {
-//            return false;
-//        }
-//
-//        // creator of the current project
-//        if (projectDetail.getUserId().equals(user.getId()) && !projectDetail.getIsTransfer()) {
-//            return true;
-//        }
-//
-//        // creator of the project's organization
-//        if (projectDetail.getOrganization().getUserId().equals(user.getId())) {
-//            return true;
-//        }
-//
-//        // owner of the project's organization
-//        RelUserOrganization orgRel = relUserOrganizationMapper.getRel(user.getId(), projectDetail.getOrgId());
-//        if (null == orgRel) {
-//            return false;
-//        }
-//
-//        if (orgRel.getRole() == UserOrgRoleEnum.OWNER.getRole()) {
-//            return true;
-//        }
-//
-//        // admin of the project
-//        RelProjectAdmin projectAdmin = relProjectAdminMapper.getByProjectAndUser(projectDetail.getId(), user.getId());
-//        if (null != projectAdmin) {
-//            return true;
-//        }
-//
-//        return false;
-    }
+        if (null == projectDetail || null == user) {
+            return false;
+        }
 
-    @Override
-    public Project checkProjectName(String keywords) {
-        if (StringUtils.isEmpty(keywords)) {
-            throw new NotFoundException("keywords is empty when check project name");
+        // creator of the current project
+        if (projectDetail.getUserId().equals(user.getId()) && !projectDetail.getIsTransfer()) {
+            return true;
+        }
+
+        // creator of the project's organization
+        if (projectDetail.getOrganization().getUserId().equals(user.getId())) {
+            return true;
+        }
+
+        // owner of the project's organization
+        RelUserOrganization orgRel = relUserOrganizationMapper.getRel(user.getId(), projectDetail.getOrgId());
+        if (null == orgRel) {
+            return false;
        }
-        Project project = projectMapper.getProjectByName(keywords);
-        // If no project is found, return projectId -1 to avoid a null pointer
-        return null == project ? new Project(-1L, -1L) : project;
+
+        if (orgRel.getRole() == UserOrgRoleEnum.OWNER.getRole()) {
+            return true;
+        }
+
+        // admin of the project
+        RelProjectAdmin projectAdmin = relProjectAdminMapper.getByProjectAndUser(projectDetail.getId(), user.getId());
+        if (null != projectAdmin) {
+            return true;
+        }
+
+        return false;
     }
+
 }
diff --git a/server/src/main/java/edp/davinci/service/impl/ShareDownloadServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/ShareDownloadServiceImpl.java
index 916634193..fdb70d991 100644
--- a/server/src/main/java/edp/davinci/service/impl/ShareDownloadServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/ShareDownloadServiceImpl.java
@@ -50,7 +50,6 @@ public class ShareDownloadServiceImpl extends DownloadCommonService implements S
     @Autowired
     private ShareService shareService;
 
-    @SuppressWarnings("unchecked")
     @Override
     public boolean submit(DownloadType downloadType, String uuid, String token, User user, List params) {
         ShareInfo shareInfo = shareService.getShareInfo(token, user);
diff --git a/server/src/main/java/edp/davinci/service/impl/ShareServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/ShareServiceImpl.java
index 705871ca8..05786cc8c 100644
--- a/server/src/main/java/edp/davinci/service/impl/ShareServiceImpl.java
+++ b/server/src/main/java/edp/davinci/service/impl/ShareServiceImpl.java
@@ -21,8 +21,7 @@
 
 import com.alibaba.druid.util.StringUtils;
 import com.alibaba.fastjson.JSONObject;
-import com.webank.wedatasphere.dss.visualis.query.utils.EnvLimitUtils;
-import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils;
+import com.webank.wedatasphere.linkis.server.security.SecurityFilter;
 import edp.core.enums.HttpCodeEnum;
 import edp.core.exception.ForbiddenExecption;
 import edp.core.exception.NotFoundException;
@@ -49,7 +48,6 @@
 import edp.davinci.model.*;
 import edp.davinci.service.ProjectService;
 import edp.davinci.service.ShareService;
-import edp.davinci.service.SourceService;
 import edp.davinci.service.ViewService;
 import lombok.extern.slf4j.Slf4j;
 import org.mindrot.jbcrypt.BCrypt;
@@ -91,18 +89,12 @@ public class ShareServiceImpl implements ShareService {
     @Autowired
     private ViewMapper viewMapper;
 
-    @Autowired
-    private SourceMapper sourceMapper;
-
     @Autowired
     private ParamsMapper paramsMapper;
 
     @Autowired
     private ViewService viewService;
 
-    @Autowired
-    private SourceService sourceService;
-
     @Autowired
     private MemDisplaySlideWidgetMapper memDisplaySlideWidgetMapper;
 
@@ -202,7 +194,6 @@ public ShareWidget getShareWidget(String token, User user) throws NotFoundExcept
         String dateToken = generateShareToken(shareWidget.getId(), shareInfo.getSharedUserName(), shareInfo.getShareUser().getId());
         shareWidget.setDataToken(dateToken);
-        dateToken = dateToken + "@@" + "widget" + "@@" + shareWidget.getId();
 
         return shareWidget;
     }
@@ -279,7 +270,6 @@ public ShareDisplay getShareDisplay(String token, User user) throws NotFoundExce
         while (widgetIterator.hasNext()) {
             ShareWidget shareWidget = widgetIterator.next();
             String dateToken = generateShareToken(shareWidget.getId(), shareInfo.getSharedUserName(), shareInfo.getShareUser().getId());
-            dateToken = dateToken + "@@" + "display" + "@@" + display.getId();
             shareWidget.setDataToken(dateToken);
         }
         shareDisplay.setWidgets(shareWidgets);
@@ -327,7 +317,6 @@ public ShareDashboard getShareDashboard(String token, User user) throws NotFound
             while (iterator.hasNext()) {
                 ShareWidget shareWidget = iterator.next();
                 String dateToken = generateShareToken(shareWidget.getId(), shareInfo.getSharedUserName(), shareInfo.getShareUser().getId());
-                dateToken = dateToken + "@@" + "dashboard" + "@@" + dashboard.getId();
                 shareWidget.setDataToken(dateToken);
             }
         }
@@ -343,13 +332,9 @@ public ShareDashboard getShareDashboard(String token, User user) throws NotFound
      * @param user
      * @return
      */
-    @SuppressWarnings("unchecked")
     @Override
-    public Paginate> getShareData(String token, ViewExecuteParam executeParam, User user, HttpServletRequest request)
+    public Paginate> getShareData(String token, ViewExecuteParam executeParam, User user)
             throws NotFoundException, ServerException, ForbiddenExecption, UnAuthorizedExecption, SQLException {
-
-        String[] tokens = token.split("@@");
-        token = tokens[0];
         ShareInfo shareInfo = getShareInfo(token, user);
         if (null == shareInfo || shareInfo.getShareId().longValue() < 1L) {
             throw new ServerException("Invalid share token");
@@ -363,53 +348,73 @@ public Paginate> getShareData(String token, ViewExecuteParam
 
         ViewWithProjectAndSource viewWithProjectAndSource = viewMapper.getViewWithProjectAndSourceByWidgetId(shareInfo.getShareId());
 
-        if(EnvLimitUtils.isProdEnv()){
-            executeParam.setEngineType(VisualisUtils.SPARK().getValue());
-        }
-        if(org.apache.commons.lang.StringUtils.isNotBlank(executeParam.getEngineType())){
-            String dataSourceName = VisualisUtils.getDataSourceName(executeParam.getEngineType()) + "DataSource";
-            if(!dataSourceName.equalsIgnoreCase(viewWithProjectAndSource.getSource().getName())){
-                Long realDataSourceId = sourceMapper.getByNameWithProjectId(dataSourceName, viewWithProjectAndSource.getProjectId());
-                Source realDataSource = sourceMapper.getById(realDataSourceId);
-                if(realDataSource == null){
-                    List sources = sourceService.getSources(viewWithProjectAndSource.getProjectId(), user, "");
-                    for (Source source : sources) {
-                        if(source.getName().contains(dataSourceName)){
-                            realDataSource = source;
-                        }
-                    }
-                }
-                viewWithProjectAndSource.setSource(realDataSource);
-            }
-        }
-
         ProjectDetail projectDetail = projectService.getProjectDetail(viewWithProjectAndSource.getProjectId(), shareInfo.getShareUser(), false);
         boolean maintainer = projectService.isMaintainer(projectDetail, shareInfo.getShareUser());
 
-        Long viewId = viewWithProjectAndSource.getId();
-        // replace user self-defined params
-        String uuid = request.getParameter("parameters");
-        if (!StringUtils.isEmpty(uuid) && tokens.length == 3) {
-            Params params = paramsMapper.getByUuid(uuid);
-            params.setParamDetails(JSONObject.parseArray(params.getParams(), ParamsDetail.class));
-            String type = tokens[1];
-            Long contentId = Long.parseLong(tokens[2]);
-            for (ParamsDetail detail : params.getParamDetails()) {
-                if (detail.getViewId().equals(viewId)) {
-                    if ("dashboard".equals(type) && contentId.equals(detail.getDashboardId())) {
-                        executeParam.setParams(detail.getVariables());
-                    } else if ("widget".equals(type) && contentId.equals(detail.getWidgetId())) {
-                        executeParam.setParams(detail.getVariables());
-                    }
-
-                }
-            }
-        }
-
-        Paginate paginate = viewService.getResultDataList(maintainer, viewWithProjectAndSource, executeParam, user, true);
+        //TODO parameters replacement
+//=======
+//        Long viewId = shareInfo.getShareId();
+//        ViewWithProjectAndSource viewWithProjectAndSource = viewMapper.getViewWithProjectAndSourceById(viewId);
+//        viewWithProjectAndSource = updateViewWithProjectAndSource(viewWithProjectAndSource,request);
+//
+//        // replace user self-defined params
+//        String uuid = request.getParameter("parameters");
+//        if(!StringUtils.isEmpty(uuid) && tokens.length == 3){
+//            Params params = paramsMapper.getByUuid(uuid);
+//            params.setParamDetails(JSONObject.parseArray(params.getParams(), ParamsDetail.class));
+//            String type = tokens[1];
+//            Long contentId = Long.parseLong(tokens[2]);
+//            for(ParamsDetail detail : params.getParamDetails()){
+//                if(detail.getViewId().equals(viewId)){
+//                    if("dashboard".equals(type) && contentId.equals(detail.getDashboardId())){
+//                        executeParam.setParams(detail.getVariables());
+//                    } else if("widget".equals(type) && contentId.equals(detail.getWidgetId())) {
+//                        executeParam.setParams(detail.getVariables());
+//                    }
+//
+//                }
+//            }
+//        }
+//
+//        String sharedUser = SecurityFilter.getLoginUsername(request);
+//        list = viewService.getResultDataList(viewWithProjectAndSource, executeParam, shareInfo.getShareUser(),sharedUser);
+////        if(VGUtils.isHiveDataSource(viewWithProjectAndSource.getSource())){
+////            //if it slows down the 
query, change it to async scheduled job +//// sessionQueryThrottle.syncParameter(user.username); +//// final ViewWithProjectAndSource fViewWithProjectAndSource = viewWithProjectAndSource; +//// list = sessionQueryThrottle.controlQuery(new Callable>>() { +//// @Override +//// public List> call() throws Exception { +//// return viewService.getResultDataList(fViewWithProjectAndSource, executeParam, shareInfo.getShareUser(),sharedUser); +//// } +//// }, user.username); +//// } else { +//// list = viewService.getResultDataList(viewWithProjectAndSource, executeParam, shareInfo.getShareUser(),sharedUser); +//// } +// } catch (ServerException e) { +// return resultFail(user, request, null).message(e.getMessage()); +// } catch (UnAuthorizedExecption e) { +// return resultFail(user, request, HttpCodeEnum.FORBIDDEN).message(e.getMessage()); +// } +//>>>>>>> drawis + + Paginate paginate = viewService.getResultDataList(maintainer, viewWithProjectAndSource, executeParam, shareInfo.getShareUser()); return paginate; } + /** + * 在source的config中加入是否是调度任务参数,在sqlUtils init 时进行相应的赋值 + * @param viewWithProjectAndSource + * @param request + * @return + */ + private ViewWithProjectAndSource updateViewWithProjectAndSource(ViewWithProjectAndSource viewWithProjectAndSource,HttpServletRequest request){ + JSONObject config = JSONObject.parseObject(viewWithProjectAndSource.getSource().getConfig()); + config.put("isSchedulerTask",true); + viewWithProjectAndSource.getSource().setConfig(config.toJSONString()); + return viewWithProjectAndSource; + } + /** * 分享数据生成csv文件并下载 @@ -448,9 +453,9 @@ public String generationShareDataCsv(ViewExecuteParam executeParam, User user, S PaginateWithQueryColumns paginate = null; try { boolean maintainer = projectService.isMaintainer(projectDetail, shareInfo.getShareUser()); - paginate = viewService.getResultDataList(maintainer, viewWithSource, executeParam, shareInfo.getShareUser(), false); + paginate = viewService.getResultDataList(maintainer, viewWithSource, 
executeParam, shareInfo.getShareUser()); } catch (SQLException e) { - log.error("failed to get result data list ", e); + e.printStackTrace(); throw new ServerException(HttpCodeEnum.SERVER_ERROR.getMessage()); } List columns = paginate.getColumns(); @@ -480,7 +485,6 @@ public String generationShareDataCsv(ViewExecuteParam executeParam, User user, S * @param request * @return */ - @SuppressWarnings("unchecked") @Override public ResultMap getDistinctValue(String token, Long viewId, DistinctParam param, User user, HttpServletRequest request) { List> list = null; diff --git a/server/src/main/java/edp/davinci/service/impl/SourceServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/SourceServiceImpl.java index 275616831..73823f8f1 100644 --- a/server/src/main/java/edp/davinci/service/impl/SourceServiceImpl.java +++ b/server/src/main/java/edp/davinci/service/impl/SourceServiceImpl.java @@ -21,11 +21,9 @@ import com.alibaba.fastjson.JSONObject; import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; -import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; -import com.webank.wedatasphere.dss.visualis.utils.HiveDBHelper; -import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; import edp.core.common.jdbc.JdbcDataSource; +import com.webank.wedatasphere.dss.visualis.service.hive.HiveDBHelper; +import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; import edp.core.enums.DataTypeEnum; import edp.core.exception.NotFoundException; import edp.core.exception.ServerException; @@ -40,13 +38,11 @@ import edp.davinci.core.model.DataUploadEntity; import edp.davinci.core.utils.CsvUtils; import edp.davinci.core.utils.ExcelUtils; -import edp.davinci.dao.ConfigMapper; import edp.davinci.dao.SourceMapper; import edp.davinci.dao.ViewMapper; import edp.davinci.dto.projectDto.ProjectDetail; import edp.davinci.dto.projectDto.ProjectPermission; import edp.davinci.dto.sourceDto.*; -import 
edp.davinci.model.Config; import edp.davinci.model.Source; import edp.davinci.model.User; import edp.davinci.model.View; @@ -92,9 +88,6 @@ public class SourceServiceImpl implements SourceService { @Autowired private ViewMapper viewMapper; - @Autowired - private ConfigMapper configMapper; - @Autowired private ProjectService projectService; @@ -104,9 +97,6 @@ public class SourceServiceImpl implements SourceService { @Autowired HiveDBHelper hiveDBHelper; - @Autowired - private ProjectAuth projectAuth; - @Override public synchronized boolean isExist(String name, Long id, Long projectId) { @@ -144,20 +134,13 @@ public List getSources(Long projectId, User user, String ticketId) throw sources = null; } } - if(sources.stream().noneMatch(s -> VisualisUtils.isLinkisDataSource(s))){ + if(CollectionUtils.isEmpty(sources)){ Source hiveSource = sourceMapper.getById(VisualisUtils.getHiveDataSourceId()); hiveSource.setId(null); hiveSource.setProjectId(projectId); sourceMapper.insert(hiveSource); totalSources.add(hiveDBHelper.sourceToHiveSource(hiveSource)); } - if(getAvailableEngineTypes(user.username).contains(VisualisUtils.PRESTO().getValue()) && sources.stream().noneMatch(s -> VisualisUtils.isPrestoDataSource(s))){ - Source prestoSource = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - prestoSource.setId(null); - prestoSource.setProjectId(projectId); - sourceMapper.insert(prestoSource); - totalSources.add(hiveDBHelper.sourceToHiveSource(prestoSource)); - } return totalSources; } @@ -215,21 +198,15 @@ public Source createSource(SourceCreate sourceCreate, User user) throws NotFound //测试连接 SourceConfig config = sourceCreate.getConfig(); - checkWhitelist(config); - boolean testConnection; - try{ - testConnection = sqlUtils - .init( - config.getUrl(), - config.getUsername(), - config.getPassword(), - config.getVersion(), - config.isExt() - ).testConnection(); - } catch (Exception e){ - throw new ServerException("failed to connect to database: " + e.getMessage(), e); - 
} + boolean testConnection = sqlUtils + .init( + config.getUrl(), + config.getUsername(), + config.getPassword(), + config.getVersion(), + config.isExt() + ).testConnection(); if (testConnection) { Source source = new Source().createdBy(user.getId()); @@ -248,15 +225,6 @@ public Source createSource(SourceCreate sourceCreate, User user) throws NotFound } } - private void checkWhitelist(SourceConfig config) { - if((Boolean) CommonConfig.ENABLE_JDBC_WHITELIST().getValue()){ - String ipPort = org.apache.commons.lang.StringUtils.substringBetween(config.getUrl(), "//", "?"); - if(!CommonConfig.JDBC_WHITELIST().getValue().contains(ipPort)){ - throw new ServerException("JDBC URL not in whitelist"); - } - } - } - /** * 修改source * @@ -273,10 +241,6 @@ public Source updateSource(SourceInfo sourceInfo, User user) throws NotFoundExce throw new NotFoundException("this source is not found"); } - if(!projectAuth.isPorjectOwner(source.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); - } - ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); @@ -291,22 +255,15 @@ public Source updateSource(SourceInfo sourceInfo, User user) throws NotFoundExce } SourceConfig sourceConfig = sourceInfo.getConfig(); - checkWhitelist(sourceConfig); //测试连接 - boolean testConnection; - try{ - testConnection = sqlUtils - .init( - sourceConfig.getUrl(), - sourceConfig.getUsername(), - sourceConfig.getPassword(), - sourceConfig.getVersion(), - sourceConfig.isExt() - ).testConnection(); - } catch (Exception e){ - throw new ServerException("failed to connect to database: " + e.getMessage(), e); - } - + boolean testConnection = sqlUtils + .init( + sourceConfig.getUrl(), + sourceConfig.getUsername(), + sourceConfig.getPassword(), + sourceConfig.getVersion(), + sourceConfig.isExt() + ).testConnection(); if (testConnection) 
{ String origin = source.toString(); @@ -345,11 +302,6 @@ public boolean deleteSrouce(Long id, User user) throws NotFoundException, UnAuth throw new NotFoundException("this source is not found"); } - - if(!projectAuth.isPorjectOwner(source.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); - } - ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); @@ -390,12 +342,6 @@ public boolean testSource(SourceTest sourceTest) throws ServerException { sourceTest.setVersion(null); sourceTest.setExt(false); } - if((Boolean) CommonConfig.ENABLE_JDBC_WHITELIST().getValue()){ - String ipPort = org.apache.commons.lang.StringUtils.substringBetween(sourceTest.getUrl(), "//", "?"); - if(!CommonConfig.JDBC_WHITELIST().getValue().contains(ipPort)){ - throw new ServerException("JDBC URL not in whitelist"); - } - } testConnection = sqlUtils .init( sourceTest.getUrl(), @@ -553,7 +499,7 @@ public List getSourceDbs(Long id, User user, String ticketId) throws Not List dbList = null; - if(VisualisUtils.isLinkisDataSource(source)){ + if(VisualisUtils.isHiveDataSource(source)){ dbList = hiveDBHelper.getHiveDBNames(ticketId); } else { try { @@ -598,7 +544,7 @@ public DBTables getSourceTables(Long id, String dbName, User user, String ticket List tableList = null; - if(VisualisUtils.isLinkisDataSource(source)){ + if(VisualisUtils.isHiveDataSource(source)){ tableList = hiveDBHelper.getHiveTables(dbName, ticketId); } else { try { @@ -642,13 +588,13 @@ public TableInfo getTableInfo(Long id, String dbName, String tableName, User use ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); TableInfo tableInfo = null; - if(VisualisUtils.isLinkisDataSource(source)){ + if(VisualisUtils.isHiveDataSource(source)){ tableInfo = hiveDBHelper.getHiveTableInfo(dbName, tableName, 
ticketId); } else { try { tableInfo = sqlUtils.init(source).getTableInfo(dbName, tableName); } catch (SourceException e) { - log.error("Error getting table data information for source: ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } } @@ -689,15 +635,6 @@ public boolean reconnect(Long id, User user) throws NotFoundException, UnAuthori return sqlUtils.init(source).testConnection(); } - @Override - public List getAvailableEngineTypes(String username) { - Config config = configMapper.getConfig("presto.enabled", "USER", username); - if(config == null || !Boolean.parseBoolean(config.getValue())){ - return Lists.newArrayList(VisualisUtils.SPARK().getValue()); - } - return Lists.newArrayList(VisualisUtils.AVAILABLE_ENGINE_TYPES().getValue().split(";")); - } - /** * 建表 * @@ -790,7 +727,7 @@ private void insertData(Set headers, List> valu } } } catch (ServerException e) { - log.error("Insert data error: ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } @@ -850,10 +787,10 @@ private void executeInsert(String tableName, Set headers, List 0) { //添加成功,发送激活邮件 - Map content = new HashMap<>(); + Map content = new HashMap(); content.put("username", user.getUsername()); content.put("host", serverUtils.getHost()); content.put("token", AESUtils.encrypt(tokenUtils.generateContinuousToken(user), null)); @@ -339,7 +338,7 @@ public boolean sendMail(String email, User user) throws ServerException { throw new ServerException("The current email address is not match user email address"); } - Map content = new HashMap<>(); + Map content = new HashMap(); content.put("username", user.getUsername()); content.put("host", serverUtils.getHost()); content.put("token", AESUtils.encrypt(tokenUtils.generateContinuousToken(user), null)); @@ -408,6 +407,7 @@ public ResultMap uploadAvatar(User user, MultipartFile file, HttpServletRequest } } catch (Exception e) { log.error("user avatar upload error, username: {}, error: {}", user.getUsername(), 
e.getMessage()); + e.printStackTrace(); return resultMap.failAndRefreshToken(request).message("user avatar upload error"); } diff --git a/server/src/main/java/edp/davinci/service/impl/ViewServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/ViewServiceImpl.java index c9a65bb01..a552e6e97 100644 --- a/server/src/main/java/edp/davinci/service/impl/ViewServiceImpl.java +++ b/server/src/main/java/edp/davinci/service/impl/ViewServiceImpl.java @@ -23,29 +23,21 @@ import com.alibaba.fastjson.JSON; import com.alibaba.fastjson.JSONArray; import com.alibaba.fastjson.JSONObject; -import com.google.common.collect.Lists; import com.google.common.collect.Maps; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; -import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus; -import com.webank.wedatasphere.dss.visualis.query.utils.ChartUtils; -import com.webank.wedatasphere.dss.visualis.query.utils.EnvLimitUtils; -import com.webank.wedatasphere.dss.visualis.query.utils.JdbcAsyncUtils; -import com.webank.wedatasphere.dss.visualis.ujes.UJESJob; -import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; -import org.apache.linkis.adapt.LinkisUtils; -import org.apache.linkis.entrance.utils.JobHistoryHelper; -import org.apache.linkis.protocol.query.cache.CacheTaskResult; import edp.core.consts.Consts; import edp.core.exception.NotFoundException; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo; +import com.webank.wedatasphere.dss.visualis.ujes.UJESJob; +import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils; import edp.core.exception.ServerException; import edp.core.exception.UnAuthorizedExecption; -import edp.core.model.BaseSource; import edp.core.model.Paginate; import edp.core.model.PaginateWithQueryColumns; import edp.core.utils.CollectionUtils; +import edp.core.utils.MD5Util; import edp.core.utils.RedisUtils; import edp.core.utils.SqlUtils; -import 
edp.davinci.common.utils.ComponentFilterUtils; import edp.davinci.core.common.Constants; import edp.davinci.core.enums.LogNameEnum; import edp.davinci.core.enums.SqlVariableTypeEnum; @@ -65,12 +57,10 @@ import edp.davinci.dto.viewDto.*; import edp.davinci.model.*; import edp.davinci.service.ProjectService; -import edp.davinci.service.SourceService; import edp.davinci.service.ViewService; import edp.davinci.service.excel.SQLContext; import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.FilenameUtils; -import org.apache.commons.lang.math.NumberUtils; +import org.apache.lucene.analysis.tokenattributes.PackedTokenAttributeImpl; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.BeanUtils; @@ -88,6 +78,7 @@ import java.util.stream.Collectors; import static edp.core.consts.Consts.COMMA; +import static edp.core.consts.Consts.MINUS; import static edp.davinci.core.common.Constants.N0_AUTH_PERMISSION; import static edp.davinci.core.enums.SqlVariableTypeEnum.AUTHVARE; import static edp.davinci.core.enums.SqlVariableTypeEnum.QUERYVAR; @@ -119,18 +110,12 @@ public class ViewServiceImpl implements ViewService { @Autowired private ProjectService projectService; - @Autowired - private SourceService sourceService; - @Autowired private SqlParseUtils sqlParseUtils; @Value("${sql_template_delimiter:$}") private String sqlTempDelimiter; - @Autowired - private ProjectAuth projectAuth; - private static final String SQL_VARABLE_KEY = "name"; @Override @@ -172,9 +157,6 @@ public List getViews(Long projectId, User user) throws NotFoundExc } } - ComponentFilterUtils filter = new ComponentFilterUtils(); - views = filter.doFilterViews(views); - return views; } @@ -248,7 +230,7 @@ public SQLContext getSQLContext(boolean isMaintainer, ViewWithSource viewWithSou */ @Override @Transactional - public ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user, String ticketId) throws NotFoundException, UnAuthorizedExecption, ServerException { 
+ public ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user) throws NotFoundException, UnAuthorizedExecption, ServerException { ProjectDetail projectDetail = projectService.getProjectDetail(viewCreate.getProjectId(), user, false); ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); @@ -260,26 +242,19 @@ public ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user, Strin log.info("the view {} name is already taken", viewCreate.getName()); throw new ServerException("the view name is already taken"); } - Long sourceId = viewCreate.getSourceId(); - Source source; - if (sourceId == 0) { - source = createViewByDss(viewCreate, user, ticketId); - } else { - source = sourceMapper.getById(sourceId); - if (null == source) { - log.info("source (:{}) not found", sourceId); - throw new NotFoundException("source is not found"); - } + Source source = sourceMapper.getById(viewCreate.getSourceId()); + if (null == source) { + log.info("source (:{}) not found", viewCreate.getSourceId()); + throw new NotFoundException("source is not found"); } - /** *update by johnnwang * 如果为hive数据源则直接保存 */ - if (VisualisUtils.isLinkisDataSource(source)) { - return createViewMethod(viewCreate, source); + if(VisualisUtils.isHiveDataSource(source)){ + return createViewMethod(viewCreate,source); } //测试连接 boolean testConnection = sqlUtils.init(source).testConnection(); @@ -310,43 +285,15 @@ public ViewWithSourceBaseInfo createView(ViewCreate viewCreate, User user, Strin } } - /** - * DSS创建view节点,默认传值为0 - * - * @param viewCreate - * @return - */ - private Source createViewByDss(ViewCreate viewCreate, User user, String ticketId) { - Source source; - List sourceList = sourceMapper.getByProject(viewCreate.getProjectId()); - if (sourceList != null && sourceList.size() > 0) { - source = sourceList.get(0); - viewCreate.setSourceId(source.getId()); - } else { - // 看看工程中有没有可用source - sourceList = 
sourceService.getSources(viewCreate.getProjectId(), user, ticketId); - if (sourceList != null && sourceList.size() > 0) { - source = sourceList.get(0); - viewCreate.setSourceId(source.getId()); - return source; - } else { - log.info("source was not found int project,projectId:{}", viewCreate.getProjectId()); - throw new NotFoundException("source is not found,DSS create view failed"); - } - } - return source; - } - /** * update by johnnwang * 保存view方法 - * * @param viewCreate * @param source * @return */ - private ViewWithSourceBaseInfo createViewMethod(ViewCreate viewCreate, Source source) { + private ViewWithSourceBaseInfo createViewMethod(ViewCreate viewCreate,Source source){ View view = new View(); BeanUtils.copyProperties(viewCreate, view); @@ -379,10 +326,6 @@ public boolean updateView(ViewUpdate viewUpdate, User user) throws NotFoundExcep throw new NotFoundException("view is not found"); } - if(!projectAuth.isPorjectOwner(viewWithSource.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); - } - ProjectDetail projectDetail = projectService.getProjectDetail(viewWithSource.getProjectId(), user, false); ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); @@ -400,14 +343,11 @@ public boolean updateView(ViewUpdate viewUpdate, User user) throws NotFoundExcep log.info("source not found"); throw new NotFoundException("source is not found"); } - //判断下model数据长度,如果数据太长,不允许保存 - String model = viewUpdate.getModel(); - if (!StringUtils.isEmpty(model) && model.length() > Constants.TEXT_MAX_LENGTH) { - throw new ServerException("your saved view data is too long"); - } - - //如果为hive数据源则直接修改 - if (VisualisUtils.isLinkisDataSource(source)) { + /** + *update by johnnwang + * 如果为hive数据源则直接修改 + */ + if(VisualisUtils.isHiveDataSource(source)){ View view = new View(); BeanUtils.copyProperties(viewUpdate, view); view.setProjectId(projectDetail.getId()); @@ -460,10 +400,6 @@ public boolean 
deleteView(Long id, User user) throws NotFoundException, UnAuthor throw new NotFoundException("view is not found"); } - if(!projectAuth.isPorjectOwner(view.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); - } - ProjectDetail projectDetail = null; try { projectDetail = projectService.getProjectDetail(view.getProjectId(), user, false); @@ -494,14 +430,13 @@ public boolean deleteView(Long id, User user) throws NotFoundException, UnAuthor /** * 获得默认的SourceWithProject - * * @param * @param user * @return */ - public SourceWithProject getDefaultSourceWithProject(Long sourceId, User user) { + private SourceWithProject getDefaultSourceWithProject(Long sourceId, User user){ SourceWithProject sourceWithProject = new SourceWithProject(); - if (VisualisUtils.getHiveDataSourceId() == sourceId || VisualisUtils.getPrestoDataSourceId() == sourceId) { + if(VisualisUtils.getHiveDataSourceId() == sourceId ){ Source source = sourceMapper.getById(sourceId); Project project = new Project(); project.setName(VisualisUtils.DEFAULT_PROJECT_NAME().getValue()); @@ -521,195 +456,81 @@ public SourceWithProject getDefaultSourceWithProject(Long sourceId, User user) * @param user * @return */ - @SuppressWarnings("unchecked") @Override public PaginateWithQueryColumns executeSql(ViewExecuteSql executeSql, User user) throws NotFoundException, UnAuthorizedExecption, ServerException { Source source = sourceMapper.getById(executeSql.getSourceId()); - if (source == null && VisualisUtils.isLinkisDataSource(source)) { - source = getDefaultSourceWithProject(source.getId(), user); - } + if(source == null && VisualisUtils.isHiveDataSource(source)){ + source = getDefaultSourceWithProject(source.getId(),user); + } if (null == source) { - throw new NotFoundException("source is not found"); - } + throw new NotFoundException("source is not found"); + } - ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); + 
ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); - ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); + ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); if (projectPermission.getSourcePermission() == UserPermissionEnum.HIDDEN.getPermission() || projectPermission.getViewPermission() < UserPermissionEnum.WRITE.getPermission()) { - throw new UnAuthorizedExecption("you have not permission to execute sql"); - } + throw new UnAuthorizedExecption("you have not permission to execute sql"); + } - //结构化Sql - PaginateWithQueryColumns paginateWithQueryColumns = null; + //结构化Sql + PaginateWithQueryColumns paginateWithQueryColumns = null; try { - SqlEntity sqlEntity = sqlParseUtils.parseSql(executeSql.getSql(), executeSql.getVariables(), sqlTempDelimiter); - if (null != sqlUtils && null != sqlEntity) { - if (!StringUtils.isEmpty(sqlEntity.getSql())) { - - if (isMaintainer(user, projectDetail)) { - sqlEntity.setAuthParams(null); - } - - if (!CollectionUtils.isEmpty(sqlEntity.getQuaryParams())) { - sqlEntity.getQuaryParams().forEach((k, v) -> { - if (v instanceof List && ((List) v).size() > 0) { - v = ((List) v).stream().collect(Collectors.joining(COMMA)).toString(); - } - sqlEntity.getQuaryParams().put(k, v); - }); - } - - String srcSql = sqlParseUtils.replaceParams(sqlEntity.getSql(), sqlEntity.getQuaryParams(), sqlEntity.getAuthParams(), sqlTempDelimiter, user); + SqlEntity sqlEntity = sqlParseUtils.parseSql(executeSql.getSql(), executeSql.getVariables(), sqlTempDelimiter); + if (null != sqlUtils && null != sqlEntity) { + if (!StringUtils.isEmpty(sqlEntity.getSql())) { - SqlUtils sqlUtils = this.sqlUtils.init(source); - - List executeSqlList = sqlParseUtils.getSqls(srcSql, false); - - List querySqlList = sqlParseUtils.getSqls(srcSql, true); - - if (VisualisUtils.isLinkisDataSource(source)) { - List limitedQuerySqlList = 
Lists.newArrayList(); - for (String querySql : querySqlList) { - String limitedQuerySql; - if (!org.apache.commons.lang.StringUtils.containsIgnoreCase(querySql, "limit") - && executeSql.getLimit() > 0) { - if (org.apache.commons.lang.StringUtils.containsIgnoreCase(querySql, ";")) { - limitedQuerySql = querySql.replaceAll(";", " limit " + executeSql.getLimit() + ";"); - } else { - limitedQuerySql = querySql + " limit " + executeSql.getLimit() + ";"; - } - limitedQuerySqlList.add(limitedQuerySql); - } else { - limitedQuerySqlList.add(querySql); - } - } - srcSql = org.apache.commons.lang.StringUtils.join(executeSqlList, ";"); - if (org.apache.commons.lang.StringUtils.isNotBlank(srcSql)) { - srcSql = srcSql + ";"; + if (isMaintainer(user, projectDetail)) { + sqlEntity.setAuthParams(null); } - srcSql = srcSql + org.apache.commons.lang.StringUtils.join(limitedQuerySqlList, ";"); - paginateWithQueryColumns = sqlUtils.syncQuery4Paginate(getRunningScript(user, source, null, projectDetail, false, srcSql, false, 300L), null, null, null, executeSql.getLimit(), null); - } else { - if (!CollectionUtils.isEmpty(executeSqlList)) { - executeSqlList.forEach(sql -> sqlUtils.execute(sql)); - } - if (!CollectionUtils.isEmpty(querySqlList)) { - for (String sql : querySqlList) { - paginateWithQueryColumns = sqlUtils.syncQuery4Paginate(sql, null, null, null, executeSql.getLimit(), null); - } + if (!CollectionUtils.isEmpty(sqlEntity.getQuaryParams())) { + sqlEntity.getQuaryParams().forEach((k, v) -> { + if (v instanceof List && ((List) v).size() > 0) { + v = ((List) v).stream().collect(Collectors.joining(COMMA)).toString(); + } + sqlEntity.getQuaryParams().put(k, v); + }); } - } - - } else { - log.warn("sql is empty, we will ignore it"); - throw new ServerException("您提交的sql是空"); - } - } - } catch (Exception e) { - e.printStackTrace(); - throw new ServerException(e.getMessage()); - } - return paginateWithQueryColumns; -} -@SuppressWarnings("unchecked") -public PaginateWithExecStatus 
AsyncSubmitSql(ViewExecuteSql executeSql, User user) throws NotFoundException, UnAuthorizedExecption, ServerException{ - - Source source = sourceMapper.getById(executeSql.getSourceId()); - if (source == null && VisualisUtils.isLinkisDataSource(source)) { - source = getDefaultSourceWithProject(source.getId(), user); - } - if (null == source) { - throw new NotFoundException("source is not found"); - } + String srcSql = sqlParseUtils.replaceParams(sqlEntity.getSql(), sqlEntity.getQuaryParams(), sqlEntity.getAuthParams(), sqlTempDelimiter, user); - ProjectDetail projectDetail = projectService.getProjectDetail(source.getProjectId(), user, false); + SqlUtils sqlUtils = this.sqlUtils.init(source); - ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); + List executeSqlList = sqlParseUtils.getSqls(srcSql, false); - if (projectPermission.getSourcePermission() == UserPermissionEnum.HIDDEN.getPermission() - || projectPermission.getViewPermission() < UserPermissionEnum.WRITE.getPermission()) { - throw new UnAuthorizedExecption("you have not permission to execute sql"); - } + List querySqlList = sqlParseUtils.getSqls(srcSql, true); - //结构化Sql - PaginateWithExecStatus paginateWithExecStatus = null; - try { - SqlEntity sqlEntity = sqlParseUtils.parseSql(executeSql.getSql(), executeSql.getVariables(), sqlTempDelimiter); - if (null != sqlUtils && null != sqlEntity) { - if (!StringUtils.isEmpty(sqlEntity.getSql())) { + if(VisualisUtils.isHiveDataSource(source)){ + //srcSql = srcSql.substring(1, srcSql.length() - 1); - if (isMaintainer(user, projectDetail)) { - sqlEntity.setAuthParams(null); - } + paginateWithQueryColumns = sqlUtils.syncQuery4Paginate(getRunningScript(user, source, null, projectDetail, false, srcSql), null, null, null, executeSql.getLimit(), null); - if (!CollectionUtils.isEmpty(sqlEntity.getQuaryParams())) { - sqlEntity.getQuaryParams().forEach((k, v) -> { - if (v instanceof List && ((List) v).size() > 0) { - v = ((List) 
v).stream().collect(Collectors.joining(COMMA)).toString(); + } else { + if (!CollectionUtils.isEmpty(executeSqlList)) { + executeSqlList.forEach(sql -> sqlUtils.execute(sql)); } - sqlEntity.getQuaryParams().put(k, v); - }); - } - - String srcSql = sqlParseUtils.replaceParams(sqlEntity.getSql(), sqlEntity.getQuaryParams(), sqlEntity.getAuthParams(), sqlTempDelimiter, user); - - SqlUtils sqlUtils = this.sqlUtils.init(source); - - List executeSqlList = sqlParseUtils.getSqls(srcSql, false); - - List querySqlList = sqlParseUtils.getSqls(srcSql, true); - - if (VisualisUtils.isLinkisDataSource(source)) { - List limitedQuerySqlList = Lists.newArrayList(); - for (String querySql : querySqlList) { - String limitedQuerySql; - if (!org.apache.commons.lang.StringUtils.containsIgnoreCase(querySql, "limit") - && executeSql.getLimit() > 0) { - if (org.apache.commons.lang.StringUtils.containsIgnoreCase(querySql, ";")) { - limitedQuerySql = querySql.replaceAll(";", " limit " + executeSql.getLimit() + ";"); - } else { - limitedQuerySql = querySql + " limit " + executeSql.getLimit() + ";"; + if (!CollectionUtils.isEmpty(querySqlList)) { + for (String sql : querySqlList) { + paginateWithQueryColumns = sqlUtils.syncQuery4Paginate(sql, null, null, null, executeSql.getLimit(), null); } - limitedQuerySqlList.add(limitedQuerySql); - } else { - limitedQuerySqlList.add(querySql); } } - srcSql = org.apache.commons.lang.StringUtils.join(executeSqlList, ";"); - if (org.apache.commons.lang.StringUtils.isNotBlank(srcSql)) { - srcSql = srcSql + ";"; - } - srcSql = srcSql + org.apache.commons.lang.StringUtils.join(limitedQuerySqlList, ";"); - paginateWithExecStatus = sqlUtils.asyncQuery4Exec(getRunningScript(user, source, null, projectDetail, false, srcSql, false, 300L), null, null, null, executeSql.getLimit(), null); - } else { - // TODO: 2021/10/30 jdbc执行还不支持异步 -/* if (!CollectionUtils.isEmpty(executeSqlList)) { - executeSqlList.forEach(sql -> sqlUtils.execute(sql)); - } - if 
(!CollectionUtils.isEmpty(querySqlList)) { - for (String sql : querySqlList) { - paginateWithExecStatus = sqlUtils.asyncQuery4Exec(sql, null, null, null, executeSql.getLimit(), null); - } - }*/ + }else{ + log.warn("sql is empty, we will ignore it"); + throw new ServerException("您提交的sql是空"); } - - } else { - log.warn("sql is empty, we will ignore it"); - throw new ServerException("您提交的sql是空"); } + } catch (Exception e) { + e.printStackTrace(); + throw new ServerException(e.getMessage()); } - } catch (Exception e) { - e.printStackTrace(); - throw new ServerException(e.getMessage()); + return paginateWithQueryColumns; } - return paginateWithExecStatus; -} private boolean isMaintainer(User user, ProjectDetail projectDetail) { return projectService.isMaintainer(projectDetail, user); @@ -721,11 +542,10 @@ private boolean isMaintainer(User user, ProjectDetail projectDetail) { * @param id * @param executeParam * @param user - * @param async * @return */ @Override - public Paginate> getData(Long id, ViewExecuteParam executeParam, User user, boolean async) throws NotFoundException, UnAuthorizedExecption, ServerException, SQLException { + public Paginate> getData(Long id, ViewExecuteParam executeParam, User user) throws NotFoundException, UnAuthorizedExecption, ServerException, SQLException { if (null == executeParam || (CollectionUtils.isEmpty(executeParam.getGroups()) && CollectionUtils.isEmpty(executeParam.getAggregators()))) { return null; } @@ -735,25 +555,6 @@ public Paginate> getData(Long id, ViewExecuteParam executePa log.info("view (:{}) not found", id); throw new NotFoundException("view is not found"); } - if (EnvLimitUtils.isProdEnv()) { - executeParam.setEngineType(VisualisUtils.SPARK().getValue()); - } - if (org.apache.commons.lang.StringUtils.isNotBlank(executeParam.getEngineType())) { - String dataSourceName = VisualisUtils.getDataSourceName(executeParam.getEngineType()) + "DataSource"; - if 
(!dataSourceName.equalsIgnoreCase(viewWithSource.getSource().getName())) { - Long realDataSourceId = sourceMapper.getByNameWithProjectId(dataSourceName, viewWithSource.getProjectId()); - Source realDataSource = sourceMapper.getById(realDataSourceId); - if (realDataSource == null) { - List sources = sourceService.getSources(viewWithSource.getProjectId(), user, ""); - for (Source source : sources) { - if (source.getName().contains(dataSourceName)) { - realDataSource = source; - } - } - } - viewWithSource.setSource(realDataSource); - } - } ProjectDetail projectDetail = projectService.getProjectDetail(viewWithSource.getProjectId(), user, false); @@ -764,47 +565,9 @@ public Paginate> getData(Long id, ViewExecuteParam executePa } boolean maintainer = projectService.isMaintainer(projectDetail, user); - return getResultDataList(maintainer, viewWithSource, executeParam, user, async); - } - - public Paginate> getAsyncProgress(String execId, User user) throws Exception { - if (JdbcAsyncUtils.isJdbcExecId(execId)) { - return JdbcAsyncUtils.getJdbcProgress(execId); - } - BaseSource source = null; - if (NumberUtils.isDigits(execId)) { - source = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - } - SqlUtils sqlUtils = this.sqlUtils.init(source); - return sqlUtils.getProgress4Exec(execId, user.username); - } - - public Paginate> killAsyncJob(String execId, User user) throws Exception { - if (JdbcAsyncUtils.isJdbcExecId(execId)) { - return JdbcAsyncUtils.getJdbcProgress(execId); - } - BaseSource source = null; - if (NumberUtils.isDigits(execId)) { - source = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - } - SqlUtils sqlUtils = this.sqlUtils.init(source); - return sqlUtils.kill4Exec(execId, user.username); + return getResultDataList(maintainer, viewWithSource, executeParam, user); } - - public Paginate> getAsyncResult(String execId, User user) throws Exception { - if (JdbcAsyncUtils.isJdbcExecId(execId)) { - return JdbcAsyncUtils.getResult(execId); - } 
- BaseSource source = null; - if (NumberUtils.isDigits(execId)) { - source = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - } - SqlUtils sqlUtils = this.sqlUtils.init(source); - return sqlUtils.getResultSet4Exec(execId, user.username); - } - - public void buildQuerySql(List querySqlList, Source source, ViewExecuteParam executeParam) { if (null != executeParam) { //构造参数, 原有的被传入的替换 @@ -831,24 +594,24 @@ public void buildQuerySql(List querySqlList, Source source, ViewExecuteP } } - public List convertFilters(List filterStrs, Source source) { + public List convertFilters(List filterStrs, Source source){ List whereClauses = new ArrayList<>(); List filters = new ArrayList<>(); - try { - if (null == filterStrs || filterStrs.isEmpty()) { + try{ + if(null == filterStrs || filterStrs.isEmpty()){ return null; } - for (String str : filterStrs) { + for(String str : filterStrs){ SqlFilter obj = JSON.parseObject(str, SqlFilter.class); - if (!StringUtils.isEmpty(obj.getName())) { + if(!StringUtils.isEmpty(obj.getName())){ obj.setName(ViewExecuteParam.getField(obj.getName(), source.getJdbcUrl(), source.getDbVersion())); } filters.add(obj); } filters.forEach(filter -> whereClauses.add(SqlFilter.dealFilter(filter))); - } catch (Exception e) { + }catch (Exception e){ log.error("convertFilters error . 
filterStrs = {}, source = {}, filters = {} , whereClauses = {} ", JSON.toJSON(filterStrs), JSON.toJSON(source), JSON.toJSON(filters), JSON.toJSON(whereClauses)); throw e; @@ -864,13 +627,12 @@ public List convertFilters(List filterStrs, Source source) { * @param viewWithSource * @param executeParam * @param user - * @param async * @return * @throws ServerException */ @Override - public PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWithSource viewWithSource, ViewExecuteParam executeParam, User user, boolean async) throws ServerException, SQLException { - PaginateWithQueryColumns paginate = new PaginateWithQueryColumns(); + public PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWithSource viewWithSource, ViewExecuteParam executeParam, User user) throws ServerException, SQLException { + PaginateWithQueryColumns paginate = null; if (null == executeParam || (CollectionUtils.isEmpty(executeParam.getGroups()) && CollectionUtils.isEmpty(executeParam.getAggregators()))) { return null; @@ -883,8 +645,6 @@ public PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWith String cacheKey = null; try { - ChartUtils.processViewExecuteParam(executeParam); - if (!StringUtils.isEmpty(viewWithSource.getSql())) { //解析变量 List variables = viewWithSource.getVariables(); @@ -904,94 +664,56 @@ public PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWith if (!CollectionUtils.isEmpty(querySqlList)) { executeParam.addExcludeColumn(excludeColumns, source.getJdbcUrl(), source.getDbVersion()); -// if (null != executeParam -// && null != executeParam.getCache() -// && executeParam.getCache() -// && executeParam.getExpired() > 0L) { -// -// StringBuilder slatBuilder = new StringBuilder(); -// slatBuilder.append(executeParam.getPageNo()); -// slatBuilder.append(MINUS); -// slatBuilder.append(executeParam.getLimit()); -// slatBuilder.append(MINUS); -// slatBuilder.append(executeParam.getPageSize()); -// 
excludeColumns.forEach(slatBuilder::append); -// -// cacheKey = MD5Util.getMD5(slatBuilder.toString() + querySqlList.get(querySqlList.size() - 1), true, 32); -// -// if (!executeParam.getFlush()) { -// try { -// Object object = redisUtils.get(cacheKey); -// if (null != object && executeParam.getCache()) { -// paginate = (PaginateWithQueryColumns) object; -// return paginate; -// } -// } catch (Exception e) { -// log.warn("get data by cache: {}", e.getMessage()); -// } -// } -// } - - if (VisualisUtils.isLinkisDataSource(source)) { - Project project = projectService.getProjectDetail(source.getProjectId(), user, false); - String viewSql = querySqlList.get(0); - CacheTaskResult cacheTaskResult = null; - if (executeParam.getCache()) { - cacheTaskResult = findOrSubmitCache(sqlUtils, viewSql, user, source, viewWithSource, project, executeParam.getExpired()); + if (null != executeParam + && null != executeParam.getCache() + && executeParam.getCache() + && executeParam.getExpired() > 0L) { + + StringBuilder slatBuilder = new StringBuilder(); + slatBuilder.append(executeParam.getPageNo()); + slatBuilder.append(MINUS); + slatBuilder.append(executeParam.getLimit()); + slatBuilder.append(MINUS); + slatBuilder.append(executeParam.getPageSize()); + excludeColumns.forEach(slatBuilder::append); + + cacheKey = MD5Util.getMD5(slatBuilder.toString() + querySqlList.get(querySqlList.size() - 1), true, 32); + + if (!executeParam.getFlush()) { + try { + Object object = redisUtils.get(cacheKey); + if (null != object && executeParam.getCache()) { + paginate = (PaginateWithQueryColumns) object; + return paginate; + } + } catch (Exception e) { + log.warn("get data by cache: {}", e.getMessage()); + } } - if (cacheTaskResult != null) { - buildScala(querySqlList, executeParam, source, cacheTaskResult); + } + + if(VisualisUtils.isHiveDataSource(source)){ + Project project = projectService.getProjectDetail(source.getProjectId(), user, false); + if(VisualisUtils.isFirstTime(viewWithSource)){ + 
buildScala(querySqlList,sqlEntity,executeParam,source,viewWithSource, user); for (String sql : querySqlList) { - if (executeParam.getFlush()) { - VisualisUtils.deleteCache(sql, user.username); - VisualisUtils.deleteCache(viewSql, user.username); - break; - } - paginate = async ? - sqlUtils.asyncQuery4Exec( - getRunningScript(user, source, viewWithSource, project, true, sql, executeParam.getCache(), executeParam.getExpired()), - executeParam.getPageNo(), - executeParam.getPageSize(), - executeParam.getTotalCount(), - executeParam.getLimit(), - excludeColumns) - : - sqlUtils.syncQuery4Paginate( - getRunningScript(user, source, viewWithSource, project, true, sql, executeParam.getCache(), executeParam.getExpired()), - executeParam.getPageNo(), - executeParam.getPageSize(), - executeParam.getTotalCount(), - executeParam.getLimit(), - excludeColumns); + paginate = sqlUtils.syncQuery4Paginate( + getRunningScript(user, source, viewWithSource, project, true, sql), + executeParam.getPageNo(), + executeParam.getPageSize(), + executeParam.getTotalCount(), + executeParam.getLimit(), + excludeColumns); } } else { buildQuerySql(querySqlList, source, executeParam); - String script = String.join(Consts.SEMICOLON, executeSqlList); - if (org.apache.commons.lang.StringUtils.isNotBlank(script)) { - script = script + Consts.SEMICOLON; - } - script = script + String.join(Consts.SEMICOLON, querySqlList); - if (executeParam.getFlush()) { - VisualisUtils.deleteCache(script, user.username); - VisualisUtils.deleteCache(viewSql, user.username); - return paginate; - } - paginate = async ? 
- sqlUtils.asyncQuery4Exec( - getRunningScript(user, source, viewWithSource, project, false, script, executeParam.getCache(), executeParam.getExpired()), - executeParam.getPageNo(), - executeParam.getPageSize(), - executeParam.getTotalCount(), - executeParam.getLimit(), - excludeColumns) - : - sqlUtils.query4Paginate( - getRunningScript(user, source, viewWithSource, project, false, script, executeParam.getCache(), executeParam.getExpired()), - executeParam.getPageNo(), - executeParam.getPageSize(), - executeParam.getTotalCount(), - executeParam.getLimit(), - excludeColumns); + paginate = sqlUtils.syncQuery4Paginate( + getRunningScript(user, source, viewWithSource, project,false, String.join(Consts.SEMICOLON, executeSqlList) + Consts.SEMICOLON + String.join(Consts.SEMICOLON, querySqlList)), + executeParam.getPageNo(), + executeParam.getPageSize(), + executeParam.getTotalCount(), + executeParam.getLimit(), + excludeColumns); } } else { buildQuerySql(querySqlList, source, executeParam); @@ -1008,68 +730,71 @@ public PaginateWithQueryColumns getResultDataList(boolean isMaintainer, ViewWith executeParam.getLimit(), excludeColumns); } - // fake async for jdbc - String execId = JdbcAsyncUtils.generateExecId(); - JdbcAsyncUtils.putResult(execId, paginate); - paginate = JdbcAsyncUtils.getJdbcProgress(execId); } } } } catch (Exception e) { - log.error("failed to get resultSet ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } -// if (null != executeParam -// && null != executeParam.getCache() -// && executeParam.getCache() -// && executeParam.getExpired() > 0L -// && null != paginate && !CollectionUtils.isEmpty(paginate.getResultList())) { -// redisUtils.set(cacheKey, paginate, executeParam.getExpired(), TimeUnit.SECONDS); -// } + if (null != executeParam + && null != executeParam.getCache() + && executeParam.getCache() + && executeParam.getExpired() > 0L + && null != paginate && !CollectionUtils.isEmpty(paginate.getResultList())) { + 
redisUtils.set(cacheKey, paginate, executeParam.getExpired(), TimeUnit.SECONDS); + } return paginate; } - private CacheTaskResult findOrSubmitCache(SqlUtils sqlUtils, String querySql, User user, Source source, View view, Project project, Long expired) throws Exception { - CacheTaskResult cacheTaskResult = JobHistoryHelper.getCache(querySql, user.username, Lists.newArrayList(VisualisUtils.SPARK().getValue() + "-*"), expired); - if (cacheTaskResult == null) { - sqlUtils.syncQuery4Paginate(getRunningScript(user, source, view, project, false, "--set ide.engine.no.limit.allow=true\n" + querySql, true, expired), 0, 0, 0, -1, null); - } - cacheTaskResult = JobHistoryHelper.getCache(querySql, user.username, Lists.newArrayList(VisualisUtils.SPARK().getValue() + "-*"), expired); - return cacheTaskResult; - } - - private String getRunningScript(User user, Source source, View view, Project project, Boolean isScala, String script, Boolean cache, Long expired) { - if (!VisualisUtils.isLinkisDataSource(source)) { - return script; + private String getRunningScript(User user,Source source, View view, Project project, Boolean isFirst,String script){ + if(! 
VisualisUtils.isHiveDataSource(source)){ + return script; } else { UJESJob ujesJob = null; String querySource = project.getName(); - if (view != null) { + if(view != null){ querySource = querySource + "/" + view.getName(); } - HashMap sourceMap = Maps.newHashMap(); + HashMap sourceMap = Maps.newHashMap(); sourceMap.put("fileName", querySource); - if (isScala) { - ujesJob = UJESJob.apply(script, user.getName(), UJESJob.SCALA_TYPE(), sourceMap, cache, expired, cache, expired); + if (isFirst){ + ujesJob = new UJESJob(script,user.getName(),UJESJob.SCALA_TYPE(), sourceMap ); } else { - ujesJob = UJESJob.apply(script, user.getName(), UJESJob.SQL_TYPE(), sourceMap, cache, expired, cache, expired); - if (VisualisUtils.isPrestoDataSource(source)) { - ujesJob.engine_$eq(UJESJob.PRESTO_ENGINE()); - ujesJob.jobType_$eq(UJESJob.PSQL_TYPE()); - } + ujesJob = new UJESJob(script,user.getName(),UJESJob.SQL_TYPE(), sourceMap); } - return LinkisUtils.gsonNoContert().toJson(ujesJob); + return BDPJettyServerHelper.gson().toJson(ujesJob); } } - - private void buildScala(List querySqlList, ViewExecuteParam executeParam, Source source, CacheTaskResult cacheTaskResult) { - String tempViewName = "view_res_" + FilenameUtils.getBaseName(cacheTaskResult.getResultLocation()); - String resultLocation = cacheTaskResult.getResultLocation() + VisualisUtils.RESULT_FILE_NAME().getValue(); - querySqlList.set(0, tempViewName); + /** + * update by johnnwang + * @param querySqlList + * @param sqlEntity + * @param executeParam + * @param source + * @param view + */ + private void buildScala(List querySqlList, SqlEntity sqlEntity, ViewExecuteParam executeParam, Source source,View view, User user){ + querySqlList.set((querySqlList.size()-1),view.getName()); buildQuerySql(querySqlList, source, executeParam); - querySqlList.set(0, VisualisUtils.buildScala(querySqlList.get(0), resultLocation, tempViewName)); + + JSONObject jsonObject = JSONObject.parseObject(view.getConfig()); + String dwcResultInfoKey = 
VisualisUtils.DWC_RESULT_INFO().getValue(); + if (null != jsonObject && jsonObject.containsKey(dwcResultInfoKey)) { + DWCResultInfo dwcResultInfo = BDPJettyServerHelper.gson().fromJson(jsonObject.getString(dwcResultInfoKey), DWCResultInfo.class); + //update tmp view result info + Project project = projectService.getProjectDetail(source.getProjectId(), user, false); + String resultPath = String.join(",", sqlUtils.querySQLWithResultSetLocation(getRunningScript(user, source, view, project, false, dwcResultInfo.getExecutionCode()), executeParam.getLimit())); + log.info("got new tmp view result path: " + resultPath); + dwcResultInfo.setResultPath(resultPath); + jsonObject.put(dwcResultInfoKey, dwcResultInfo); + view.setConfig(JSONObject.toJSONString(jsonObject)); + viewMapper.update(view); + + querySqlList.set(0, VisualisUtils.buildScala(querySqlList.get(0),dwcResultInfo,view.getName())); + } } @Override @@ -1124,24 +849,24 @@ public List> getDistinctValueData(boolean isMaintainer, View String sql = st.render(); querySqlList.set(querySqlList.size() - 1, sql); -// if (null != param.getCache() && param.getCache() && param.getExpired().longValue() > 0L) { -// cacheKey = MD5Util.getMD5("DISTINCI" + sql, true, 32); -// -// try { -// Object object = redisUtils.get(cacheKey); -// if (null != object) { -// return (List) object; -// } -// } catch (Exception e) { -// log.warn("get distinct value by cache: {}", e.getMessage()); -// } -// } + if (null != param.getCache() && param.getCache() && param.getExpired().longValue() > 0L) { + cacheKey = MD5Util.getMD5("DISTINCI" + sql, true, 32); + + try { + Object object = redisUtils.get(cacheKey); + if (null != object) { + return (List) object; + } + } catch (Exception e) { + log.warn("get distinct value by cache: {}", e.getMessage()); + } + } } List> list = null; - if (VisualisUtils.isLinkisDataSource(source)) { + if(VisualisUtils.isHiveDataSource(source)){ Project project = projectService.getProjectDetail(source.getProjectId(), user, 
false); - list = sqlUtils.query4List(getRunningScript(user, source, viewWithSource, project, false, String.join(Consts.SEMICOLON, executeSqlList) + Consts.SEMICOLON + String.join(Consts.SEMICOLON, querySqlList), param.getCache(), param.getExpired()), -1); + list = sqlUtils.query4List(getRunningScript(user, source, viewWithSource, project, false, String.join(Consts.SEMICOLON, executeSqlList) + Consts.SEMICOLON + String.join(Consts.SEMICOLON, querySqlList)), -1); } else { if (!CollectionUtils.isEmpty(executeSqlList)) { executeSqlList.forEach(sql -> sqlUtils.execute(sql)); @@ -1151,9 +876,9 @@ public List> getDistinctValueData(boolean isMaintainer, View } } -// if (null != param.getCache() && param.getCache() && param.getExpired().longValue() > 0L) { -// redisUtils.set(cacheKey, list, param.getExpired(), TimeUnit.SECONDS); -// } + if (null != param.getCache() && param.getCache() && param.getExpired().longValue() > 0L) { + redisUtils.set(cacheKey, list, param.getExpired(), TimeUnit.SECONDS); + } if (null != list) { return list; @@ -1161,7 +886,7 @@ public List> getDistinctValueData(boolean isMaintainer, View } } } catch (Exception e) { - log.error("failed to get distinct value data, ", e); + e.printStackTrace(); throw new ServerException(e.getMessage()); } @@ -1190,7 +915,6 @@ private List getQueryVariables(List variables) { return null; } - @SuppressWarnings("unchecked") private List getAuthVariables(List roleViewList, List variables) { if (!CollectionUtils.isEmpty(variables)) { @@ -1241,7 +965,6 @@ private List getAuthVariables(List roleViewList, List< return null; } - @SuppressWarnings("unchecked") private void packageParams(boolean isProjectMaintainer, Long viewId, SqlEntity sqlEntity, List variables, List paramList, Set excludeColumns, User user) { List queryVariables = getQueryVariables(variables); @@ -1331,7 +1054,7 @@ private void packageParams(boolean isProjectMaintainer, Long viewId, SqlEntity s throw (ServerException) e.getCause(); } } catch 
(InterruptedException e) { - log.error("thread package params get interrupted", e); + e.printStackTrace(); } finally { executorService.shutdown(); } @@ -1348,7 +1071,6 @@ private void packageParams(boolean isProjectMaintainer, Long viewId, SqlEntity s } - @SuppressWarnings("unchecked") private void checkAndInsertRoleParam(String sqlVarible, List roles, User user, View view) { List variables = JSONObject.parseArray(sqlVarible, SqlVariable.class); diff --git a/server/src/main/java/edp/davinci/service/impl/WidgetServiceImpl.java b/server/src/main/java/edp/davinci/service/impl/WidgetServiceImpl.java index a1ba942ef..615ee1546 100644 --- a/server/src/main/java/edp/davinci/service/impl/WidgetServiceImpl.java +++ b/server/src/main/java/edp/davinci/service/impl/WidgetServiceImpl.java @@ -20,7 +20,6 @@ package edp.davinci.service.impl; import com.alibaba.druid.util.StringUtils; -import com.webank.wedatasphere.dss.visualis.auth.ProjectAuth; import edp.core.exception.NotFoundException; import edp.core.exception.ServerException; import edp.core.exception.UnAuthorizedExecption; @@ -29,7 +28,6 @@ import edp.core.utils.CollectionUtils; import edp.core.utils.FileUtils; import edp.core.utils.ServerUtils; -import edp.davinci.common.utils.ComponentFilterUtils; import edp.davinci.core.enums.FileTypeEnum; import edp.davinci.core.enums.LogNameEnum; import edp.davinci.core.enums.UserPermissionEnum; @@ -54,7 +52,6 @@ import edp.davinci.service.ViewService; import edp.davinci.service.WidgetService; import lombok.extern.slf4j.Slf4j; -import org.apache.linkis.server.BDPJettyServerHelper; import org.apache.poi.ss.usermodel.Sheet; import org.apache.poi.xssf.streaming.SXSSFWorkbook; import org.slf4j.Logger; @@ -75,8 +72,8 @@ import java.util.concurrent.Executors; import static edp.core.consts.Consts.EMPTY; -import static edp.davinci.common.utils.ScriptUtils.getExecuptParamScriptEngine; -import static edp.davinci.common.utils.ScriptUtils.getViewExecuteParam; +import static 
edp.davinci.common.utils.ScriptUtiils.getExecuptParamScriptEngine; +import static edp.davinci.common.utils.ScriptUtiils.getViewExecuteParam; @Service("widgetService") @@ -111,9 +108,6 @@ public class WidgetServiceImpl implements WidgetService { @Autowired private ProjectService projectService; - @Autowired - private ProjectAuth projectAuth; - @Override public synchronized boolean isExist(String name, Long id, Long projectId) { Long widgetId = widgetMapper.getByNameWithProjectId(name, projectId); @@ -152,9 +146,6 @@ public List getWidgets(Long projectId, User user) throws NotFoundExcepti } } - ComponentFilterUtils filter = new ComponentFilterUtils(); - widgets = filter.doFilterWidgets(widgets); - return widgets; } @@ -212,14 +203,14 @@ public Widget createWidget(WidgetCreate widgetCreate, User user) throws NotFound View view = viewMapper.getById(widgetCreate.getViewId()); if (null == view) { log.info("view (:{}) is not found", widgetCreate.getViewId()); - //throw new NotFoundException("view not found"); + throw new NotFoundException("view not found"); } Widget widget = new Widget().createdBy(user.getId()); BeanUtils.copyProperties(widgetCreate, widget); int insert = widgetMapper.insert(widget); if (insert > 0) { - optLogger.info("widget ({}) create by user(:{})", widget, user.getUsername()); + optLogger.info("widget ({}) create by user(:{})", widget.toString()); return widget; } else { throw new ServerException("create widget fail"); @@ -242,19 +233,6 @@ public boolean updateWidget(WidgetUpdate widgetUpdate, User user) throws NotFoun throw new NotFoundException("widget is not found"); } - if(!projectAuth.isPorjectOwner(widget.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); - } - - // 如果前端带的widget config中的view为空,设置为viewId的值 - Long viewId = widgetUpdate.getViewId(); - Map widgetUpdateConfig = BDPJettyServerHelper.gson().fromJson(widgetUpdate.getConfig(), Map.class); - 
if(widgetUpdateConfig.get("view").toString().equals("{}")) { - widgetUpdateConfig.put("view", viewId); - widgetUpdate.setConfig(BDPJettyServerHelper.gson().toJson(widgetUpdateConfig)); - } - - ProjectDetail projectDetail = projectService.getProjectDetail(widget.getProjectId(), user, false); ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); @@ -269,26 +247,19 @@ public boolean updateWidget(WidgetUpdate widgetUpdate, User user) throws NotFoun throw new ServerException("the widget name is already taken"); } - if(widgetUpdate.getViewId() != null && widgetUpdate.getViewId() > 0){ - View view = viewMapper.getById(widgetUpdate.getViewId()); - if (null == view) { - log.info("view (:{}) not found", widgetUpdate.getViewId()); - throw new NotFoundException("view not found"); - } + View view = viewMapper.getById(widgetUpdate.getViewId()); + if (null == view) { + log.info("view (:{}) not found", widgetUpdate.getViewId()); + throw new NotFoundException("view not found"); } String originStr = widget.toString(); BeanUtils.copyProperties(widgetUpdate, widget); - // 判断是否更新过config - if(widgetUpdate.getConfig().equals(widget.getConfig())) { - widget.updateByWithoutUpdateTime(user.getId()); - } else { - widget.updatedBy(user.getId()); - } + widget.updatedBy(user.getId()); int update = widgetMapper.update(widget); if (update > 0) { - optLogger.info("widget ({}) is updated by user(:{}), origin: ({})", widget, user.getId(), originStr); + optLogger.info("widget ({}) is updated by user(:{}), origin: ({})", widget.toString(), user.getId(), originStr); return true; } else { throw new ServerException("update widget fail"); @@ -308,21 +279,17 @@ public boolean deleteWidget(Long id, User user) throws NotFoundException, UnAuth Widget widget = widgetMapper.getById(id); if (null == widget) { - log.warn("widget (:{}) is not found", id); - return true; - } else { - ProjectDetail projectDetail = projectService.getProjectDetail(widget.getProjectId(), user, 
false); - ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); - - //校验权限 - if (projectPermission.getWidgetPermission() < UserPermissionEnum.DELETE.getPermission()) { - log.info("user {} have not permisson to delete widget", user.getUsername()); - throw new UnAuthorizedExecption("you have not permission to delete widget"); - } + log.info("widget (:{}) is not found", id); + throw new NotFoundException("widget is not found"); } - if(!projectAuth.isPorjectOwner(widget.getProjectId(), user.getId())) { - throw new UnAuthorizedExecption("current user has no permission."); + ProjectDetail projectDetail = projectService.getProjectDetail(widget.getProjectId(), user, false); + ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); + + //校验权限 + if (projectPermission.getWidgetPermission() < UserPermissionEnum.DELETE.getPermission()) { + log.info("user {} have not permisson to delete widget", user.getUsername()); + throw new UnAuthorizedExecption("you have not permission to delete widget"); } //删除引用widget的dashboard @@ -406,7 +373,7 @@ public String generationFile(Long id, ViewExecuteParam executeParam, User user, boolean maintainer = projectService.isMaintainer(projectDetail, user); - PaginateWithQueryColumns paginate = viewService.getResultDataList(maintainer, viewWithSource, executeParam, user, false); + PaginateWithQueryColumns paginate = viewService.getResultDataList(maintainer, viewWithSource, executeParam, user); List columns = paginate.getColumns(); if (!CollectionUtils.isEmpty(columns)) { File file = new File(rootPath); @@ -496,14 +463,14 @@ public File writeExcel(Set widgets, executeParam = getViewExecuteParam((engine), null, widget.getConfig(), null); } - PaginateWithQueryColumns paginate = viewService.getResultDataList(maintainer, viewWithProjectAndSource, executeParam, user, false); + PaginateWithQueryColumns paginate = viewService.getResultDataList(maintainer, 
viewWithProjectAndSource, executeParam, user); sheet = wb.createSheet(sheetName); ExcelUtils.writeSheet(sheet, paginate.getColumns(), paginate.getResultList(), wb, containType, widget.getConfig(), executeParam.getParams()); } catch (ServerException e) { - log.error("Error writing widget data to excel: ", e); + e.printStackTrace(); } catch (SQLException e) { - log.error("Error writing widget data to excel: ", e); + e.printStackTrace(); } finally { sheet = null; countDownLatch.countDown(); diff --git a/server/src/main/java/edp/davinci/service/screenshot/ScreenshotUtil.java b/server/src/main/java/edp/davinci/service/screenshot/ScreenshotUtil.java index 21dca0020..7fa6c396c 100644 --- a/server/src/main/java/edp/davinci/service/screenshot/ScreenshotUtil.java +++ b/server/src/main/java/edp/davinci/service/screenshot/ScreenshotUtil.java @@ -21,11 +21,15 @@ import com.alibaba.druid.util.StringUtils; import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig; +import com.webank.wedatasphere.linkis.adapt.LinkisUtils; import edp.core.utils.ServerUtils; +import edp.davinci.dao.CronJobMapper; import edp.davinci.dao.UserMapper; +import edp.davinci.model.CronJob; import edp.davinci.model.User; +import edp.davinci.service.CronJobService; +import edp.davinci.service.UserService; import lombok.extern.slf4j.Slf4j; -import org.apache.linkis.adapt.LinkisUtils; import org.openqa.selenium.*; import org.openqa.selenium.chrome.ChromeDriver; import org.openqa.selenium.chrome.ChromeDriverService; @@ -79,7 +83,7 @@ public class ScreenshotUtil { public void screenshot(long userId, long jobId, List imageContents) { User user = userMapper.getById(userId); - log.info("start screenshot for job: {}, and set screenshot time out second is: {}", jobId, timeOutSecond); + log.info("start screenshot for job: {}", jobId); try { CountDownLatch countDownLatch = new CountDownLatch(imageContents.size()); List futures = new ArrayList<>(imageContents.size()); @@ -87,15 +91,10 @@ public void 
screenshot(long userId, long jobId, List imageContents log.info("thread for screenshot start, type: {}, id: {}", content.getDesc(), content.getCId()); try { File image = doScreenshot(content.getUrl(), user.username); - if (null != image) { - log.info("Finished doing screenshot, file path: {}", image.getAbsolutePath()); - content.setContent(image); - } else { - log.info("Screenshot failed. Set the picture content to null."); - content.setContent(null); - } + content.setContent(image); } catch (Exception e) { log.error("error ScreenshotUtil.screenshot, ", e); + e.printStackTrace(); } finally { countDownLatch.countDown(); log.info("thread for screenshot finish, type: {}, id: {}", content.getDesc(), content.getCId()); @@ -113,7 +112,7 @@ public void screenshot(long userId, long jobId, List imageContents imageContents.sort(Comparator.comparing(ImageContent::getOrder)); } catch (InterruptedException e) { - log.error("screenshot thread gets interrupted, ", e); + e.printStackTrace(); } finally { log.info("finish screenshot for job: {}", jobId); } @@ -121,71 +120,44 @@ public void screenshot(long userId, long jobId, List imageContents private File doScreenshot(String url, String username) throws Exception { - url = getUrlWithEnv(url); WebDriver driver = generateWebDriver(); - driver.get(serverUtils.getServerUrl()); - Cookie ticketCookie = new Cookie(CommonConfig.TICKET_ID_STRING().getValue(), LinkisUtils.getUserTicketKV(username)._2, - serverUtils.getAccessAddress(), "/", new Date(System.currentTimeMillis() + 1000 * 60 * 60 * 24 * 30L)); - - Cookie innerCookie = new Cookie("dataworkcloud_inner_request", "true", serverUtils.getAccessAddress(), - "/", new Date(System.currentTimeMillis() + 1000 * 60 * 60 * 24 * 30L)); - + Cookie ticketCookie = new Cookie(CommonConfig.TICKET_ID_STRING().getValue(), LinkisUtils.getUserTicketKV(username)._2, serverUtils.getAccessAddress(), "/", new Date(System.currentTimeMillis() + 1000*60*60*24*30L)); + Cookie innerCookie = new 
Cookie("dataworkcloud_inner_request", "true", serverUtils.getAccessAddress(), "/", new Date(System.currentTimeMillis() + 1000*60*60*24*30L)); driver.manage().addCookie(ticketCookie); driver.manage().addCookie(innerCookie); driver.get(url); - - log.info("for user {} getting... {}", username, url); + log.info("getting... {}", url); try { - log.info("Start the screenshot and set the timeout value is {}", timeOutSecond); WebDriverWait wait = new WebDriverWait(driver, timeOutSecond); ExpectedCondition ConditionOfSign = ExpectedConditions.presenceOfElementLocated(By.id("headlessBrowserRenderSign")); ExpectedCondition ConditionOfWidth = ExpectedConditions.presenceOfElementLocated(By.id("width")); ExpectedCondition ConditionOfHeight = ExpectedConditions.presenceOfElementLocated(By.id("height")); - // WidgetExecuteFailedTag - ExpectedCondition ConditionOfWidgetExecuteFailedTag = - ExpectedConditions.presenceOfElementLocated(By.id("WidgetExecuteFailedTag")); - - wait.until(ExpectedConditions.or(ConditionOfSign, ConditionOfWidgetExecuteFailedTag, ConditionOfWidth, ConditionOfHeight)); - - WebElement widgetExecuteFailedTag = null; - widgetExecuteFailedTag = waitUntilElementInvisible(By.id("WidgetExecuteFailedTag"), driver); + wait.until(ExpectedConditions.or(ConditionOfSign, ConditionOfWidth, ConditionOfHeight)); - if (null == widgetExecuteFailedTag) { + String widthVal = driver.findElement(By.id("width")).getAttribute("value"); + String heightVal = driver.findElement(By.id("height")).getAttribute("value"); - String widthVal = driver.findElement(By.id("width")).getAttribute("value"); - String heightVal = driver.findElement(By.id("height")).getAttribute("value"); + int width = DEFAULT_SCREENSHOT_WIDTH; + int height = DEFAULT_SCREENSHOT_HEIGHT; - int width = DEFAULT_SCREENSHOT_WIDTH; - int height = DEFAULT_SCREENSHOT_HEIGHT; - - if (!StringUtils.isEmpty(widthVal)) { - log.info("Browser resolution width is {}", widthVal); - width = Integer.parseInt(widthVal); - } else { - 
log.info("The browser resolution width is the default: {}", width); - } + if (!StringUtils.isEmpty(widthVal)) { + width = Integer.parseInt(widthVal); + } - if (!StringUtils.isEmpty(heightVal)) { - log.info("Browser resolution height is {}", heightVal); - height = Integer.parseInt(heightVal); - } else { - log.info("The browser resolution height is the default: {}", height); - } - driver.manage().window().setSize(new Dimension(width, height)); - Thread.sleep(2000); - return ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE); - } else { - log.error("When the screenshot is taken, the widget execution fails and the widget WidgetExecuteFailedTag tag is captured!"); - return null; + if (!StringUtils.isEmpty(heightVal)) { + height = Integer.parseInt(heightVal); } + driver.manage().window().setSize(new Dimension(width, height)); + Thread.sleep(2000); + return ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE); } catch (InterruptedException e) { - log.error("do screenshot thread gets interrupted, ", e); + e.printStackTrace(); } finally { - log.info("for user {} finished getting {}, webdriver will quit soon", username, url); + log.info("finish get {}, webdriver will quit soon", url); driver.quit(); } return null; @@ -208,7 +180,6 @@ private WebDriver generateWebDriver() throws ExecutionException { driver.manage().timeouts().implicitlyWait(3, TimeUnit.MINUTES); driver.manage().window().maximize(); driver.manage().window().setSize(new Dimension(DEFAULT_SCREENSHOT_WIDTH, DEFAULT_SCREENSHOT_HEIGHT)); - return driver; } @@ -233,13 +204,6 @@ private WebDriver generateChromeDriver() throws ExecutionException { options.addArguments("silent"); options.addArguments("--disable-application-cache"); - options.addArguments("disable-dev-shm-usage"); - options.addArguments("remote-debugging-port=9012"); - -// options.addArguments("--no-sandbox"); -// options.addArguments("--disable-dev-shm-usage"); -// options.addArguments("--headless"); - return new ChromeDriver(options); 
} @@ -252,41 +216,7 @@ private WebDriver generatePhantomJsDriver() throws ExecutionException { } log.info("Generating PhantomJs driver ({})...", PHANTOMJS_PATH); System.setProperty(PhantomJSDriverService.PHANTOMJS_EXECUTABLE_PATH_PROPERTY, PHANTOMJS_PATH); - PhantomJSDriver phantomJSDriver = null; - try { - phantomJSDriver = new PhantomJSDriver(); - } catch (final Exception e) { - //initialization failed, so retry once; if it fails twice, PhantomJS is basically unavailable for now - log.warn("failed to new PhantomJSDriver, we will do it once again", e); - phantomJSDriver = new PhantomJSDriver(); - } - return phantomJSDriver; - } - private String getUrlWithEnv(String url) { - String env = CommonConfig.ACCESS_ENV().getValue(); - if (org.apache.commons.lang.StringUtils.isBlank(env)) { - return url; - } - url = url.replace("?", "?env=" + env + "&"); - return url; - } - - /** - * Defect record: - * We previously fetched the failure element with widgetExecuteFailedTag = driver.findElement(By.id("WidgetExecuteFailedTag")); to get the failure element, - * and judged execution success by whether widgetExecuteFailedTag was null. This caused a problem: when execution succeeds, the WidgetExecuteFailedTag element is never created, - * so findElement keeps searching for it for a long time and screenshots become much slower. The timeout is now set with the method below - * */ - private WebElement waitUntilElementInvisible(By element, WebDriver driver) { - driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS); - WebElement expectElement = null; - try { - expectElement = driver.findElement(element); - } catch (NoSuchElementException e) { - log.info("When the screenshot page is opened, the widget execution failure tag is not found.
So Screenshot successful!"); - } - driver.manage().timeouts().implicitlyWait(timeOutSecond, TimeUnit.SECONDS); - return expectElement; + return new PhantomJSDriver(); } } diff --git a/server/src/main/resources/banner.txt b/server/src/main/resources/banner.txt index fcb98808c..bd4d26862 100644 --- a/server/src/main/resources/banner.txt +++ b/server/src/main/resources/banner.txt @@ -1,13 +1,7 @@ - ___ - ___ /\__\ _____ - /\ \ /:/ _/_ /::\ \ ___ - \:\ \ /:/ /\ \ /:/\:\ \ /\__\ - \:\ \ /:/ /::\ \ /:/ /::\__\ /:/__/ - ___ \:\__\/:/_/:/\:\__\/:/_/:/\:|__|/::\ \ -/\ \ |:| |\:\/:/ /:/ /\:\/:/ /:/ /\/\:\ \__ -\:\ \|:| | \::/ /:/ / \::/_/:/ / ~~\:\/\__\ - \:\__|:|__| \/_/:/ / \:\/:/ / \::/ / - \::::/__/ /:/ / \::/ / /:/ / - ~~~~ \/__/ \/__/ \/__/ - Visualis (VSBI) version: 1.0.0 - Spring Boot version: 2.3.7.RELEASE + ___ _ _ + | \ __ _ __ __(_) _ _ __ (_) + | |) |/ _` |\ V /| || ' \ / _|| | + |___/ \__,_| \_/ |_||_||_|\__||_| + + Davinci version: 0.3 + Spring Boot version: 2.0.4.RELEASE diff --git a/server/src/main/resources/mybatis/mapper/MemDashboardWidgetMapper.xml b/server/src/main/resources/mybatis/mapper/MemDashboardWidgetMapper.xml index ecb84a7cf..8a820ddfe 100644 --- a/server/src/main/resources/mybatis/mapper/MemDashboardWidgetMapper.xml +++ b/server/src/main/resources/mybatis/mapper/MemDashboardWidgetMapper.xml @@ -78,7 +78,7 @@ - + update mem_dashboard_widget `dashboard_id` = #{item.dashboardId,jdbcType=BIGINT}, diff --git a/server/src/main/resources/mybatis/mapper/ProjectMapper.xml b/server/src/main/resources/mybatis/mapper/ProjectMapper.xml index 8352356e2..8e8b6a6e7 100644 --- a/server/src/main/resources/mybatis/mapper/ProjectMapper.xml +++ b/server/src/main/resources/mybatis/mapper/ProjectMapper.xml @@ -25,7 +25,7 @@ SELECT LAST_INSERT_ID() AS id - insert visualis_project + insert dss_project `name`, @@ -139,13 +139,13 @@ u.`id` as 'createBy.id', IF(u.`name` is NULL, u.`username`, u.`name`) as 'createBy.username', u.`avatar` as 'createBy.avatar' - from 
visualis_project p - left join `visualis_user` u on u.`id` = p.`user_id` + from dss_project p + left join `linkis_user` u on u.`id` = p.`user_id` left join star s on (s.target_id = p.id and s.`target` = 'project' and s.user_id = #{userId}) - where p.isArchive != 1 and p.id in ( + where p.id in ( select DISTINCT p.id - from visualis_project p + from dss_project p left join rel_project_admin rpa on rpa.project_id = p.id where p.user_id = #{userId} or rpa.user_id = #{userId} @@ -153,7 +153,7 @@ select DISTINCT p.id - from visualis_project p + from dss_project p left join rel_role_project rrp on rrp.project_id = p.id left join rel_role_user rru on rru.role_id = rrp.role_id where rru.user_id = #{userId} @@ -163,7 +163,7 @@ select DISTINCT p.id - from visualis_project p + from dss_project p left join rel_user_organization ruo on ruo.org_id = p.org_id left join organization o on o.id = p.org_id where o.user_id = #{userId} @@ -173,7 +173,7 @@ select DISTINCT p.id - from visualis_project p + from dss_project p left join rel_user_organization ruo on ruo.org_id = p.org_id left join organization o on o.id = p.org_id where ruo.user_id = #{userId} and (ruo.role = 1 or (p.visibility = 1 and o.member_permission = 1)) @@ -225,13 +225,13 @@ diff --git a/server/src/main/resources/mybatis/mapper/UserMapper.xml b/server/src/main/resources/mybatis/mapper/UserMapper.xml index 8cff3aec1..d4d241ee0 100644 --- a/server/src/main/resources/mybatis/mapper/UserMapper.xml +++ b/server/src/main/resources/mybatis/mapper/UserMapper.xml @@ -25,7 +25,7 @@ SELECT LAST_INSERT_ID() AS id - insert `visualis_user` + insert `linkis_user` `email`, `username`, @@ -74,7 +74,7 @@ - - - - - update `view` - - - `name`=#{name}, - - - `description`=#{description}, - - - `project_id`=#{projectId}, - - - `source_id`=#{sourceId}, - - - `sql`=#{sql}, - - - `model`=#{model}, - - - `variable`=#{variable}, - - - `config`=#{config}, - - - `update_by`=#{updateBy}, - - - `update_time`=#{updateTime}, - - - \ No newline at 
end of file diff --git a/server/src/main/resources/templates/js/executeParam.js b/server/src/main/resources/templates/js/executeParam.js index 63003a1b5..68086f050 100644 --- a/server/src/main/resources/templates/js/executeParam.js +++ b/server/src/main/resources/templates/js/executeParam.js @@ -5308,7 +5308,6 @@ function getWidgetExecuteParam(widgetConfig) { var rows = _widgetConfig.rows; var metrics = _widgetConfig.metrics; var filters = _widgetConfig.filters; - var view = _widgetConfig.view; var color = _widgetConfig.color; var label = _widgetConfig.label; var size = _widgetConfig.size; @@ -5398,8 +5397,7 @@ function getWidgetExecuteParam(widgetConfig) { orders: orders, cache: cache, expired: expired, - nativeQuery: nativeQuery, - view: view + nativeQuery: nativeQuery }; return requestParams } diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/configuration/CommonConfig.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/configuration/CommonConfig.scala index 5e6363aa0..f4e4a9527 100644 --- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/configuration/CommonConfig.scala +++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/configuration/CommonConfig.scala @@ -1,24 +1,42 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ package com.webank.wedatasphere.dss.visualis.configuration -import org.apache.linkis.common.conf.CommonVars - - +import com.webank.wedatasphere.linkis.common.conf.CommonVars +import sun.misc.BASE64Encoder +/** + * Created by allenlliu on 2019/1/26. + */ object CommonConfig { + val QUERY_PERSISTENCE_SPRING_APPLICATION_NAME = CommonVars("wds.dss.query.application.name", "cloud-query") + val QUERY_PERSISTENCE_SPRING_TIME = CommonVars("wds.dss.visualis.query.time", 10000) + val ENGINE_DEFAULT_LIMIT = CommonVars("wds.dss.engine.default.limit", 5000) /** - * API unauthorized-access check - * */ - val CHECK_PROJECT_USER = CommonVars("wds.dss.visualis.check.project.user", false) + This is the configuration to get the hive database source + */ + val GATEWAY_IP = CommonVars("wds.dss.visualis.gateway.ip", "") - val ENGINE_DEFAULT_LIMIT = CommonVars("wds.dss.engine.default.limit", 5000) + val GATEWAY_PORT = CommonVars("wds.dss.visualis.gateway.port", "") + + val GATEWAY_PROTOCOL = CommonVars("wds.dss.visualis.gateway.protocol", "http://") val DB_URL_SUFFIX = CommonVars("wds.dss.visualis.database.url", "/api/rest_j/v1/datasource/dbs") - /** - * After Linkis switched to the Apache version, the ticket id was changed - * key: wds.linkis.session.ticket.key - * linkis_user_session_ticket_id_v1 - * */ - val TICKET_ID_STRING = CommonVars("wds.dss.visualis.ticketid", "linkis_user_session_ticket_id_v1") + val TICKET_ID_STRING = CommonVars("wds.dss.visualis.ticketid", "bdp-user-ticket-id") val TABLE_URL_SUFFIX = CommonVars("wds.dss.visualis.table.url", "/api/rest_j/v1/datasource/tables") @@ -28,26 +46,4 @@ object CommonConfig { val HIVE_DATASOURCE_NAME = CommonVars("wds.dss.visualis.hive.datasource.name", "hive")
JDBC_CACHE_FLUSH_WRITE = CommonVars("wds.dss.visualis.jdbc.cache.flush.write", 30L) - - val DEPLOY_ENV = CommonVars("wds.dss.visualis.deploy.env", "DEV") - val ACCESS_ENV = CommonVars("wds.dss.visualis.access.env", "") - - val ENABLE_PASSWORD_ENCRYPT = CommonVars("wds.dss.visualis.enable.password.encrypt", false) - - val LINKIS_MYSQL_PUB_KEY = CommonVars("wds.linkis.mysql.pub.key", "MIIBIjANBgkqhki" + "G9w0BAQEFAAOCAQ8AMIIBCgKCAQEAvBulc+/VDSKuwMUdtrZ1vYm9FU64E4l5EOQ5LozdRZoAxv1nlwEMKW5crGHBA" + "T3rmdOOEHow67r55zjXks6mMDyuU+y32TWsphR6haUMsRcfeBWp5h3csQBaaDT2di2pL+rxMXvhodAoI9U1bSf4U5q8mcJn" + "Cln1twOUky3BCS8VH95QawHYvTe+1NINL+aJG3W4g9JfEwoFPnDOQHGFryotNMs1zZBt3PDyNsMrPloBVLFVUAT7RpmXkEmjfpqfzDvdO4F" + "9MSBan6sk2jQyWJUg4FKXgsqeXUz+OYpbNW8Dw0Q5E5JDGMtrx1kzX8mVooheoS7SpiPJsgi26FPSEwIDAQAB") - - val LINKIS_MYSQL_PRIV_KEY = CommonVars("wds.linkis.mysql.pri.key", "MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQC8G6Vz7" + "9UNIq7AxR22tnW9ib0VTrgTiXkQ5DkujN1FmgDG/WeXAQwpblysYcEBPeuZ044QejDruvnnONeS" + "zqYwPK5T7LfZNaymFHqFpQyxFx94FanmHdyxAFpoNPZ2Lakv6vExe+Gh0Cgj1TVtJ/hTmryZwmcKWfW3A5STLcEJ" + "LxUf3lBrAdi9N77U0g0v5okbdbiD0l8TCgU+cM5AcYWvKi00yzXNkG3c8PI2wys+WgFUsVVQBPtGmZeQSaN+mp/MO907gX" + "0xIFqfqyTaNDJYlSDgUpeCyp5dTP45ils1bwPDRDkTkkMYy2vHWTNfyZWiiF6hLtKmI8myCLboU9ITAgMBAAECggEANxpqJ" + "0I0SPrF8lZL1AAzEWjN6PX8WkzFGDuivI4rK35nh+Mne0alR2W65Axmu3RmFdOxJAaHWiaVmjQ+ghTi/fJoptELMifVAXmyQoAM7bt" + "2TnkaIfzRb1BJK4mIQSozC4RpTzOY7wvJFmYYlndE+Ui0wt39zTx5DDmSRmL6zzNoTG5pPgmN/zq2icbhXqD0DP8wxw4AFKJWdrc" + "MkkjRfKkByqA03bymfQqIz5uHril60o3xuuTyBPR74bnPdJE4ONahQHIgvWj/aQYqNyaapJJ8C194Acin0hl1QRv30+syM95QBdLEbLPAU" + "Ho2ClWRJZ6cqDPZe2a2N5YpbXtIQQKBgQD0iremr8O42yENqfAXK06Cek30E6wHTJ/RYipcveVg34rXsJ+yTls7SV6tyCqNknZzMoRw" + "QrIs74TN+KDF52wudrTpbsf5IKJQyExOz7LxTJ76h2OVA9zM1/MPtOHLt5mnwhLTtmxhVZ54CXbkw237pSagG+HhLyrO8S4mIwe" + "H8QKBgQDE7AEClojuj5cwRH46ic2s/oIuBObNFeJcRvxx+ONNdlOWOKRi6FhfHlhzLoFDUci2bjn1fvP1EMYZ+KkXATyezgIjJ" + 
"nnClXFpsORhUWh0SiqS3gVJeSIEDKeuh9esRPXk/cyPa3V8o5HouWWDitday0Xsnw51/sVTbN3b5z0eQwKBgEys9hqgv+jFZJ7JKwv" + "Iu2wz9x9Rz73WK8JWWlwL+tEeJoWsztX0taxoO/SXb6hGRTennlkowH9Qdr6yd462GniTJfSPlMorjllwBGUtwLjiQnLhYrsFpATi" + "rUa+e5IJtncgZhDWATOfyflvVkUyddjSlsLbGz8lL/IFM2gn0aOxAoGADJlQ4zqAXkr/kE4BiXtBlnzeFVWo8pwg1GiSRDR5Tn5wkJ7" + "lHZLh/IvzesMR8B2uasWYnbVWpGpDUmwPXXJtz3c8ucT/a0ymae2wXu2XckFAgg8EZZQDciDhJZB5YwMyfEkkqlRkuum4LxyVexo" + "J9zwkKCRxB2madGD1vNkJlwMCgYANF7fiG6k3D45Nopu8iTbi3S9oOnhTWxpwxSJWTUij4HFmtSXjJfgOPG9rVvO5QCaHWDWHE/LyZ" + "/Y51ustAV3uj5UmnGwXQDNEgNZUFe4vwzYq7ikXoE6zCTzs70DT/4llos5g1rs8feuWrJK19DPKrenxyOLI6pPA9GjMgC1aEg==") - - val ENABLE_JDBC_WHITELIST = CommonVars[Boolean]("wds.dss.visualis.enable.jdbc.whitelist", false) - - val JDBC_WHITELIST = CommonVars("wds.dss.visualis.jdbc.whitelist", "") - - val JDBC_ENCRYPT_PARAMETER = CommonVars("wds.dss.visualis.jdbc.encrypt.parameter", "encrypt=true") } diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/LinkisClientExecutor.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/LinkisClientExecutor.scala deleted file mode 100644 index 2f551948a..000000000 --- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/LinkisClientExecutor.scala +++ /dev/null @@ -1,358 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.entrance.spark - -import java.util -import java.util.concurrent.{Executors, TimeUnit} -import com.google.common.cache.CacheBuilder -import com.webank.wedatasphere.dss.visualis.entrance.spark.LinkisClientExecutor.linkisClient -import com.webank.wedatasphere.dss.visualis.exception.{ResultTypeException, SparkEngineExecuteException, VGErrorException} -import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus -import com.webank.wedatasphere.dss.visualis.res.ResultHelper -import com.webank.wedatasphere.dss.visualis.ujes.UJESJob -import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils -import 
org.apache.linkis.common.conf.Configuration -import org.apache.linkis.common.io.FsPath -import org.apache.linkis.common.utils.{Logging, Utils} -import org.apache.linkis.cs.common.utils.CSCommonUtils -import org.apache.linkis.httpclient.dws.authentication.TokenAuthenticationStrategy -import org.apache.linkis.httpclient.dws.config.DWSClientConfigBuilder -import org.apache.linkis.protocol.constants.TaskConstant -import org.apache.linkis.protocol.utils.ZuulEntranceUtils -import org.apache.linkis.scheduler.queue.SchedulerEventState -import org.apache.linkis.storage.FSFactory -import org.apache.linkis.storage.resultset.table.{TableMetaData, TableRecord} -import org.apache.linkis.storage.resultset.{ResultSetFactory, ResultSetReader} -import org.apache.linkis.ujes.client.UJESClient -import org.apache.linkis.ujes.client.request.JobExecuteAction -import org.apache.linkis.ujes.client.response.{JobExecuteResult, JobInfoResult} -import edp.core.exception.{ServerException, SourceException} -import edp.core.model.{BaseSource, PaginateWithQueryColumns, QueryColumn} -import edp.core.utils.SqlUtils -import edp.davinci.model.Source -import org.apache.commons.lang.StringUtils -import org.apache.linkis.adapt.LinkisUtils -import org.json4s.DefaultFormats -import org.json4s.jackson.Serialization.read -import org.springframework.context.annotation.Scope -import org.springframework.jdbc.core.JdbcTemplate -import org.springframework.stereotype.Component - -import scala.collection.JavaConversions -import scala.collection.JavaConversions._ -import scala.math.BigDecimal.RoundingMode - - -@Component -@Scope("prototype") -class LinkisClientExecutor extends SqlUtils with Logging{ - - private var umUser:String="" - implicit val formats = DefaultFormats - - override def init(source: BaseSource): SqlUtils = { - if(source == null || VisualisUtils.isLinkisDataSource(source)){ - //info(s"SparkEntranceExecutor is initing, config is: ${source.asInstanceOf[Source].getConfig}") - val executor = new 
LinkisClientExecutor - executor.jdbcDataSource = this.jdbcDataSource - executor.jdbcUrl = this.jdbcUrl - executor.username = this.username - executor.password = this.password - if(source.isInstanceOf[Source]){ - val config = LinkisUtils.gson.fromJson(source.asInstanceOf[Source].getConfig,classOf[util.Map[String,Any]]) - executor.isSchedulerTask = if(null == config.get("isSchedulerTask")) true else config.get("isSchedulerTask").asInstanceOf[Boolean] - } - executor - } else super.init(source) - } - - override def init(jdbcUrl: String, username: String, password: String, dbVersion: String, ext: Boolean): SqlUtils = { - val source = new BaseSource { - override def getJdbcUrl: String = jdbcUrl - override def getUsername: String = username - override def getPassword: String = password - override def getDatabase: String = "" - override def getDbVersion: String = dbVersion - override def isExt: Boolean = ext - } - init(source) - } - - private def executeUntil[T](sql: String, op: JobExecuteResult => T): T = { - info(s"$umUser began to executeRealJob script:$sql") - val jobExecuteResult: JobExecuteResult = submitQuery(sql) - var jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - var status = SchedulerEventState.withName(jobInfoResult.getRequestPersistTask.getStatus) - while(!SchedulerEventState.isCompleted(status)) { - Utils.sleepQuietly(500) - jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - status = SchedulerEventState.withName(jobInfoResult.getRequestPersistTask.getStatus) - } - if(!SchedulerEventState.isSucceed(status)){ - val jobInfo = linkisClient.getJobInfo(jobExecuteResult) - val requestPersistTask = jobInfo.getRequestPersistTask - if(requestPersistTask.getErrCode != null && requestPersistTask.getErrDesc != null){ - throw SparkEngineExecuteException(requestPersistTask.getErrCode, "spark engine run sql failed:" + requestPersistTask.getErrDesc) - } - throw SparkEngineExecuteException(60001, "spark engine run sql failed") - } - info(s"$umUser finish 
to executeRealJob script:$sql") - return op(jobExecuteResult) - } - - def submitQuery(sql: String): JobExecuteResult = { - val input = read[UJESJob](sql) - var code = input.code - val jobType = input.jobType - val source = JavaConversions.mapAsJavaMap(input.source.asInstanceOf[Map[String, Any]]) - umUser = input.user - val params = new util.HashMap[String, Any]() - val configuration = new util.HashMap[String, Any]() - val runtime = new util.HashMap[String, Any]() - // updated cache related params - if(StringUtils.isNotBlank(input.nodeName)){ - runtime.put(CSCommonUtils.NODE_NAME_STR, input.nodeName) - } - if(StringUtils.isNotBlank(input.contextId)){ - runtime.put(CSCommonUtils.CONTEXT_ID_STR, input.contextId) - } - runtime.put(TaskConstant.CACHE, input.cache.asInstanceOf[java.lang.Boolean]) - runtime.put(TaskConstant.CACHE_EXPIRE_AFTER, input.cacheExpireAfter.asInstanceOf[java.lang.Long]) - runtime.put(TaskConstant.READ_FROM_CACHE, input.readFromCache.asInstanceOf[java.lang.Boolean]) - runtime.put(TaskConstant.READ_CACHE_BEFORE, input.readCacheBefore.asInstanceOf[java.lang.Long]) - configuration.put(TaskConstant.PARAMS_CONFIGURATION_RUNTIME, runtime) - params.put(TaskConstant.PARAMS_CONFIGURATION, configuration) - // if (jobType.equals(UJESJob.SQL_TYPE)) { - // code = SqlUtils.filterAnnotate(code) - // SqlUtils.checkSensitiveSql(code) - // } - val builder = JobExecuteAction.builder() - .setCreator(VisualisUtils.VG_CREATOR.getValue) - .addExecuteCode(code) - .setEngineTypeStr(input.engine) - .setRunTypeStr(jobType) - .setSource(source) - .setUser(umUser) - .setParams(params) - - val jobExecuteResult = linkisClient.execute(builder.build()) - jobExecuteResult - } - - /** - * 将一个结果集文件或结果集解析成一个List - * @param resultSet 结果集文件或结果集 - * @return - */ - private def getResultSet(resultSet: String): util.List[util.Map[String, AnyRef]] = { - info(s"$umUser began to get the result of execution :$resultSet") - val rsFactory= ResultSetFactory.getInstance - val res = new 
util.ArrayList[util.Map[String, AnyRef]]() - if(rsFactory.isResultSet(resultSet)){ - val resultSetModel = rsFactory.getResultSetByContent(resultSet) - if(ResultSetFactory.TABLE_TYPE != resultSetModel.resultSetType()){ - throw new VGErrorException(60013,"不支持不是表格的结果集") - } - val reader =ResultSetReader.getResultSetReader(resultSetModel,resultSet) - val metaData = reader.getMetaData.asInstanceOf[TableMetaData] - while (reader.hasNext) { - val record = reader.getRecord.asInstanceOf[TableRecord] - val lineMap = new util.LinkedHashMap[String, AnyRef]() - val columns = metaData.columns - val row = record.row - if (columns.size > 0) { - for (x <- 0 until columns.size) { - lineMap.put(columns(x).columnName, parseValue(row(x))) - } - } - res.add(lineMap) - } - Utils.tryQuietly(reader.close()) - //info(s"$umUser finish to get the result of execution :$resultSet") - res - }else if(rsFactory.isResultSetPath(resultSet)){ - val resPath = new FsPath(ResultHelper.getSchemaPath(resultSet)) - val resultSetContent = rsFactory.getResultSetByPath(resPath) - if(ResultSetFactory.TABLE_TYPE != resultSetContent.resultSetType()){ - throw new VGErrorException(60014,"不支持不是表格的结果集") - } - val fs = FSFactory.getFs(resPath) - fs.init(null) - val reader =ResultSetReader.getResultSetReader(resultSetContent,fs.read(resPath)) - val metaData = reader.getMetaData.asInstanceOf[TableMetaData] - while (reader.hasNext){ - val record = reader.getRecord.asInstanceOf[TableRecord] - val lineMap = new util.LinkedHashMap[String,AnyRef]() - val columns = metaData.columns - val row = record.row - if(columns.size>0) { - for (x <- 0 until columns.size) { - lineMap.put(columns(x).columnName, parseValue(row(x))) - } - } - res.add(lineMap) - } - Utils.tryQuietly(reader.close()) - Utils.tryQuietly(fs.close()) - //info(s"$umUser finish to get the result of execution :$resultSet") - res - }else{ - throw new ResultTypeException(60015,"结果集类型异常:"+resultSet) - } - } - - private def parseValue(original: Any) : AnyRef = { - 
original match { - case bigDecimal: BigDecimal => bigDecimal.toDouble.asInstanceOf[AnyRef] - case bool: Boolean => bool.toString - case boolean: java.lang.Boolean => boolean.toString - case _ => original.asInstanceOf[AnyRef] - } - } - - private def querySQLWithJobExecuteResult(sql: String, limit: Int): JobExecuteResult = { - executeUntil(sql, j => j) - } - - - override def execute(sql: String): Unit = { - executeUntil(sql, _ => ()) - } - - override def query4List(sql: String, limit: Int): util.List[util.Map[String, AnyRef]] = { - val jobExecuteResult = querySQLWithJobExecuteResult(sql, limit) - val jobInfo = linkisClient.getJobInfo(jobExecuteResult) - getResultSet(VisualisUtils.getResultSetPath(jobInfo)) - } - - override def submit4Exec(sql: String, pageNo: Int, pageSize: Int, totalCount: Int, limit: Int, excludeColumns: util.Set[String]): PaginateWithExecStatus = { - val paginateWithQueryColumns = new PaginateWithExecStatus - val jobExecuteResult = submitQuery(sql) - paginateWithQueryColumns.setExecId(jobExecuteResult.getTaskID()) - paginateWithQueryColumns.setProgress(0.0f) - return paginateWithQueryColumns; - } - - override def getProgress4Exec(execId: String, user: String): PaginateWithExecStatus = { - val jobExecuteResult = LinkisClientExecutor.getJobExecuteResult(execId, user) - val jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - val paginateWithQueryColumns = new PaginateWithExecStatus - paginateWithQueryColumns.setExecId(execId) - paginateWithQueryColumns.setProgress(BigDecimal(jobInfoResult.getRequestPersistTask.getProgress).setScale(2, RoundingMode.HALF_UP).floatValue()) - paginateWithQueryColumns.setStatus(jobInfoResult.getRequestPersistTask.getStatus) - return paginateWithQueryColumns - } - - override def kill4Exec(execId: String, user: String): PaginateWithExecStatus = { - val jobExecuteResult = LinkisClientExecutor.getJobExecuteResult(execId, user) - var jobInfoResult = LinkisClientExecutor.jobInfoCache.getIfPresent(execId) - if(null == 
jobInfoResult){ - jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - } - val status = SchedulerEventState.withName(jobInfoResult.getRequestPersistTask.getStatus) - if(SchedulerEventState.isCompleted(status)){ - LinkisClientExecutor.saveJobExecuteResult(execId, jobInfoResult) - } else { - jobExecuteResult.setExecID(jobInfoResult.getTask.get("strongerExecId").toString) - linkisClient.kill(jobExecuteResult) - } - val paginateWithQueryColumns = new PaginateWithExecStatus - paginateWithQueryColumns.setExecId(execId) - paginateWithQueryColumns.setProgress(BigDecimal(jobInfoResult.getRequestPersistTask.getProgress).setScale(2, RoundingMode.HALF_UP).floatValue()) - paginateWithQueryColumns.setStatus(jobInfoResult.getRequestPersistTask.getStatus) - return paginateWithQueryColumns - } - - override def getResultSet4Exec(execId: String, user: String): PaginateWithExecStatus = { - val jobExecuteResult = LinkisClientExecutor.getJobExecuteResult(execId, user) - var jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - jobExecuteResult.setExecID(jobInfoResult.getRequestPersistTask.getExecId) - var status = SchedulerEventState.withName(jobInfoResult.getRequestPersistTask.getStatus) - while(!SchedulerEventState.isCompleted(status)) { - Utils.sleepQuietly(500) - jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - status = SchedulerEventState.withName(jobInfoResult.getRequestPersistTask.getStatus) - } - val paginateWithQueryColumns = new PaginateWithExecStatus - jobInfoResult = linkisClient.getJobInfo(jobExecuteResult) - paginateWithQueryColumns.setExecId(execId) - paginateWithQueryColumns.setProgress(BigDecimal(jobInfoResult.getRequestPersistTask.getProgress).setScale(2, RoundingMode.HALF_UP).floatValue()) - paginateWithQueryColumns.setStatus(jobInfoResult.getRequestPersistTask.getStatus) - - val jobInfo = linkisClient.getJobInfo(jobExecuteResult) - val resultList = getResultSet(VisualisUtils.getResultSetPath(jobInfo)) - 
paginateWithQueryColumns.setResultList(resultList) - paginateWithQueryColumns.setTotalCount(resultList.size()) - val columns = ResultHelper.getResultType(VisualisUtils.getResultSetPath(jobInfo)) - paginateWithQueryColumns.setColumns(columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList) - return paginateWithQueryColumns - } - - override def query4Paginate(sql: String, pageNo: Int, pageSize: Int, totalCount: Int, limit: Int, excludeColumns: util.Set[String]): PaginateWithQueryColumns = { - val paginateWithQueryColumns = new PaginateWithQueryColumns - val jobExecuteResult = querySQLWithJobExecuteResult(sql, limit) - val jobInfo = linkisClient.getJobInfo(jobExecuteResult) - val resultList = getResultSet(VisualisUtils.getResultSetPath(jobInfo)) - paginateWithQueryColumns.setResultList(resultList) - paginateWithQueryColumns.setTotalCount(resultList.size()) - - val columns = ResultHelper.getResultType(VisualisUtils.getResultSetPath(jobInfo)) - paginateWithQueryColumns.setColumns(columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList) - return paginateWithQueryColumns - } - - /** - * Check whether the table exists - * - * @param tableName - * @return - * @throws SourceException - */ - override def tableIsExist(tableName: String): Boolean = super.tableIsExist(tableName) - - /** - * Query columns by sql - * - * @param sql - * @return - * @throws ServerException - */ - override def getColumns(sql: String): util.List[QueryColumn] = { - val jobExecuteResult = querySQLWithJobExecuteResult(sql, 2) - val jobInfo = linkisClient.getJobInfo(jobExecuteResult) - val columns = ResultHelper.getResultType(VisualisUtils.getResultSetPath(jobInfo)) - columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList - } - - override def testConnection(): Boolean = super.testConnection() - - override def jdbcTemplate(): JdbcTemplate = super.jdbcTemplate() - - override def executeBatch(sql: String, headers: util.Set[QueryColumn], datas:
util.List[util.Map[String, AnyRef]]): Unit = super.executeBatch(sql, headers, datas) - -} - -object LinkisClientExecutor{ - val clientConfig = DWSClientConfigBuilder.newBuilder().addServerUrl(Configuration.getGateWayURL()) - .connectionTimeout(30000).discoveryEnabled(false) - .maxConnectionSize(1000) - .retryEnabled(false).readTimeout(30000) - .setAuthenticationStrategy(new TokenAuthenticationStrategy()).setAuthTokenKey("dss-AUTH") - .setAuthTokenValue("dss-AUTH").setDWSVersion("v1").build() - val linkisClient = UJESClient(clientConfig) - - def getJobExecuteResult(taskId: String, user: String) = { - val jobExecuteResult = new JobExecuteResult - jobExecuteResult.setUser(user) - jobExecuteResult.setTaskID(taskId) - jobExecuteResult - } - - val jobInfoCache = CacheBuilder - .newBuilder - .expireAfterWrite(5, TimeUnit.MINUTES) - .maximumSize(5000) - .build[String, JobInfoResult] - - def saveJobExecuteResult(execId: String, jobInfoResult: JobInfoResult) = { - jobInfoCache.put(execId, jobInfoResult) - } -} diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SparkEntranceExecutor.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SparkEntranceExecutor.scala index a66d664af..f9daa924d 100644 --- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SparkEntranceExecutor.scala +++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SparkEntranceExecutor.scala @@ -1,55 +1,82 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.entrance.spark
 
 import java.util
+import java.util.concurrent.ConcurrentHashMap
+
 import com.google.gson.reflect.TypeToken
+import com.webank.wedatasphere.linkis.common.io.FsPath
+import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils}
+import com.webank.wedatasphere.linkis.entrance.EntranceServer
+import com.webank.wedatasphere.linkis.entrance.exception.{EntranceRPCException, QueryFailedException}
+import com.webank.wedatasphere.linkis.protocol.constants.TaskConstant
+import com.webank.wedatasphere.linkis.protocol.query.{RequestPersistTask, RequestQueryTask, ResponsePersist}
+import com.webank.wedatasphere.linkis.rpc.Sender
+import com.webank.wedatasphere.linkis.scheduler.queue.SchedulerEventState
+import com.webank.wedatasphere.linkis.server.{BDPJettyServerHelper, JMap}
+import com.webank.wedatasphere.linkis.server.security.SecurityFilter
+import com.webank.wedatasphere.linkis.storage.FSFactory
+import com.webank.wedatasphere.linkis.storage.resultset.table.{TableMetaData, TableRecord}
+import com.webank.wedatasphere.linkis.storage.resultset.{ResultSetFactory, ResultSetReader}
 import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig
 import com.webank.wedatasphere.dss.visualis.exception.{ResultTypeException, SparkEngineExecuteException, VGErrorException}
-import com.webank.wedatasphere.dss.visualis.model.PaginateWithExecStatus
 import com.webank.wedatasphere.dss.visualis.res.ResultHelper
 import com.webank.wedatasphere.dss.visualis.ujes.UJESJob
 import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils
-import org.apache.linkis.common.io.FsPath
-import org.apache.linkis.common.utils.{Logging, Utils}
-import org.apache.linkis.cs.common.utils.CSCommonUtils
-import org.apache.linkis.entrance.EntranceServer
-import org.apache.linkis.entrance.exception.{EntranceRPCException, QueryFailedException}
-import org.apache.linkis.governance.common.entity.task.{RequestPersistTask, RequestQueryTask}
-import org.apache.linkis.protocol.constants.TaskConstant
-import org.apache.linkis.rpc.Sender
-import org.apache.linkis.scheduler.queue.SchedulerEventState
-import org.apache.linkis.server.JMap
-import org.apache.linkis.server.security.SecurityFilter
-import org.apache.linkis.storage.FSFactory
-import org.apache.linkis.storage.resultset.table.{TableMetaData, TableRecord}
-import org.apache.linkis.storage.resultset.{ResultSetFactory, ResultSetReader}
+import com.webank.wedatasphere.linkis.adapt.LinkisUtils
 import edp.core.exception.{ServerException, SourceException}
 import edp.core.model._
 import edp.core.utils.SqlUtils
 import edp.davinci.model.Source
-import org.apache.commons.lang.StringUtils
-import org.apache.linkis.adapt.LinkisUtils
 import org.json4s._
 import org.json4s.jackson.Serialization.read
 import org.springframework.beans.factory.annotation.Autowired
-import org.springframework.context.annotation.Scope
-import org.springframework.jdbc.core.JdbcTemplate
+import org.springframework.context.annotation.{Scope, ScopedProxyMode}
 import org.springframework.stereotype.Component
 import org.springframework.web.context.request.{RequestContextHolder, ServletRequestAttributes}
 
-import scala.collection.JavaConversions
 import scala.collection.JavaConversions._
-import scala.math.BigDecimal.RoundingMode
+import org.json4s.JsonDSL._
+import org.json4s.jackson.JsonMethods._
+import org.json4s.jackson.Serialization._
+import org.json4s._
+import org.json4s.jackson._
+import org.json4s.jackson.Serialization.{read, write}
+import org.springframework.context.annotation.Scope
+import org.springframework.jdbc.core.JdbcTemplate
+import org.springframework.web.context.WebApplicationContext
+
+import scala.collection.JavaConversions
 
+/**
+  * Created by shanhuang on 2019/1/23.
+  */
+@Component
+@Scope("prototype")
 class SparkEntranceExecutor extends SqlUtils with Logging{
   private var umUser:String=""
   implicit val formats = DefaultFormats
 
   @Autowired
-  var entranceServer: EntranceServer = _
+  private var entranceServer: EntranceServer = _
 
   override def init(source: BaseSource): SqlUtils = {
-    if(source == null || VisualisUtils.isLinkisDataSource(source)){
+    if(VisualisUtils.isHiveDataSource(source)){
       //info(s"SparkEntranceExecutor is initing, config is: ${source.asInstanceOf[Source].getConfig}")
       val executor = new SparkEntranceExecutor
       executor.jdbcDataSource = this.jdbcDataSource
@@ -79,13 +106,30 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
 
   private def executeUntil[T](sql: String, op: VisualisJob => T): T = {
     info(s"$umUser began to executeRealJob script:$sql")
-    val execId: String = submitQuery(sql)
-    //SparkEntranceExecutor.putJobCache(umUser,execId)//缓存相应的执行ID
+    val input = read[UJESJob](sql)
+    var code =input.code
+    val jobType = input.jobType
+    val source = JavaConversions.mapAsJavaMap(input.source.asInstanceOf[Map[String, String]])
+    umUser = input.user
+    if(jobType.equals(UJESJob.SQL_TYPE)) {
+      code = SqlUtils.filterAnnotate(code)
+      SqlUtils.checkSensitiveSql(code)
+    }
+    val requestMap = new JMap[String, Any]
+    requestMap.put(TaskConstant.UMUSER, umUser)
+    requestMap.put(TaskConstant.REQUESTAPPLICATIONNAME, VisualisUtils.VG_CREATOR.getValue)
+    requestMap.put(TaskConstant.EXECUTEAPPLICATIONNAME, VisualisUtils.VG_APP_NAME.getValue)
+    requestMap.put(TaskConstant.EXECUTIONCODE, code)
+    requestMap.put(TaskConstant.RUNTYPE,jobType)
+    requestMap.put(TaskConstant.SOURCE,source)
+    requestMap.put(TaskConstant.PARAMS,new util.HashMap())
+    val execId = entranceServer.execute(requestMap)
+    SparkEntranceExecutor.putJobCache(umUser,execId)//缓存相应的执行ID
     entranceServer.getJob(execId) foreach {
       case job: VisualisJob =>
         job.waitForCompleted()
         if (!SchedulerEventState.isSucceed(job.getState)){
-          job.getJobRequest match {
+          job.getTask match {
             case t: RequestPersistTask =>
               if(t.getErrCode != null && t.getErrDesc != null){
                 throw SparkEngineExecuteException(t.getErrCode, "spark engine run sql failed:" + t.getErrDesc)
@@ -98,48 +142,59 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
           throw SparkEngineExecuteException(60001, "spark engine run sql failed")
         }
         info(s"$umUser finish to executeRealJob script:$sql")
-        //SparkEntranceExecutor.removeJobCache(umUser,execId)
+        SparkEntranceExecutor.removeJobCache(umUser,execId)
         return op(job)
       case _ =>
-        //SparkEntranceExecutor.removeJobCache(umUser,execId)
+        SparkEntranceExecutor.removeJobCache(umUser,execId)
         throw new VGErrorException(70001, "executeRealJob failed, not supported job type.")
     }
     throw new VGErrorException(70001, s"executeRealJob failed, cannot find the job $execId.")
   }
 
-  def submitQuery(sql: String) = {
-    val input = read[UJESJob](sql)
-    var code = input.code
-    val jobType = input.jobType
-    val source = JavaConversions.mapAsJavaMap(input.source.asInstanceOf[Map[String, String]])
-    umUser = input.user
-//    if (jobType.equals(UJESJob.SQL_TYPE)) {
-//      code = SqlUtils.filterAnnotate(code)
-//      SqlUtils.checkSensitiveSql(code)
-//    }
-    val requestMap = new JMap[String, Any]
-    requestMap.put(TaskConstant.UMUSER, umUser)
-    requestMap.put(TaskConstant.REQUESTAPPLICATIONNAME, VisualisUtils.VG_CREATOR.getValue)// input.creator)
-    requestMap.put(TaskConstant.EXECUTEAPPLICATIONNAME, input.engine)
-    requestMap.put(TaskConstant.EXECUTIONCODE, code)
-    requestMap.put(TaskConstant.RUNTYPE, jobType)
-    requestMap.put(TaskConstant.SOURCE, source)
-    val params = new util.HashMap[String, Object]()
-    val configuration = new util.HashMap[String, Object]()
-    val runtime = new util.HashMap[String, Object]()
-    // updated cache related params
-    runtime.put(TaskConstant.CACHE, input.cache.asInstanceOf[java.lang.Boolean])
-    runtime.put(TaskConstant.CACHE_EXPIRE_AFTER, input.cacheExpireAfter.asInstanceOf[java.lang.Long])
-    runtime.put(TaskConstant.READ_FROM_CACHE, input.readFromCache.asInstanceOf[java.lang.Boolean])
-    runtime.put(TaskConstant.READ_CACHE_BEFORE, input.readCacheBefore.asInstanceOf[java.lang.Long])
-    if(StringUtils.isNotBlank(input.contextId)){
-      runtime.put(CSCommonUtils.CONTEXT_ID_STR, input.contextId)
+  private def getHistoryQuery(sql: String): Option[RequestPersistTask] = {
+    val ujesJob = read[UJESJob](sql)
+    val code = ujesJob.code
+    val sender = Sender.getSender(CommonConfig.QUERY_PERSISTENCE_SPRING_APPLICATION_NAME.getValue)
+    val task = new RequestQueryTask
+    // task.setUmUser(getCurrentUser)
+    task.setUmUser(ujesJob.user)
+    task.setExecutionCode(code)
+    var responsePersist:ResponsePersist = null
+    Utils.tryThrow(
+      responsePersist = sender.ask(task).asInstanceOf[ResponsePersist]
+    )(e => {
+      val errorException = new EntranceRPCException(60010, "sender rpc failed")
+      errorException.initCause(e)
+      throw errorException
+    })
+    if (responsePersist != null){
+      val status = responsePersist.getStatus()
+      val message = responsePersist.getMsg()
+      if (status != 0 ){
+        throw new QueryFailedException(60011, "insert task failed, reason: " + message)
+      }
+      val data = responsePersist.getData()
+      val inputJson=LinkisUtils.gson.toJson(data.get(TaskConstant.TASK))
+      // val extractJson = parse(inputJson)
+      val typeToken = new TypeToken[java.util.List[RequestPersistTask]](){}.getType
+      val ResponseRequestPersistTasks = LinkisUtils.gson.fromJson(inputJson,typeToken).asInstanceOf[java.util.List[RequestPersistTask]]
+      // val ResponseRequestPersistTasks = extractJson.extract[List[RequestPersistTask]]
+      if (ResponseRequestPersistTasks == null){
+        throw new QueryFailedException(60012, "query task failed, reason: " + message)
+      }
+      val now =System.currentTimeMillis()
+      val sortedTasks = ResponseRequestPersistTasks.filter( x=>{
+        now - x.getCreatedTime.getTime < CommonConfig.QUERY_PERSISTENCE_SPRING_TIME.getValue
+      }).sortWith((x,y)=>x.getCreatedTime.getTime > y.getCreatedTime.getTime)
+      if(sortedTasks.size >0){
+        Some(sortedTasks(0))
+      }else{
+        None
+      }
+    }else{
+      None
     }
-    configuration.put(TaskConstant.PARAMS_CONFIGURATION_RUNTIME, runtime)
-    params.put(TaskConstant.PARAMS_CONFIGURATION, configuration)
-    requestMap.put(TaskConstant.PARAMS, params)
-    val execId = entranceServer.execute(requestMap)
-    execId
+
   }
 
   /**
@@ -170,11 +225,10 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
         }
         res.add(lineMap)
       }
-      Utils.tryQuietly(reader.close())
-      //info(s"$umUser finish to get the result of execution :$resultSet")
+      info(s"$umUser finish to get the result of execution :$resultSet")
      res
     }else if(rsFactory.isResultSetPath(resultSet)){
-      val resPath = new FsPath(ResultHelper.getSchemaPath(resultSet))
+      val resPath = new FsPath(resultSet)
       val resultSetContent = rsFactory.getResultSetByPath(resPath)
       if(ResultSetFactory.TABLE_TYPE != resultSetContent.resultSetType()){
         throw new VGErrorException(60014,"不支持不是表格的结果集")
@@ -195,9 +249,7 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
         }
         res.add(lineMap)
       }
-      Utils.tryQuietly(reader.close())
-      Utils.tryQuietly(fs.close())
-      //info(s"$umUser finish to get the result of execution :$resultSet")
+      info(s"$umUser finish to get the result of execution :$resultSet")
       res
     }else{
       throw new ResultTypeException(60015,"结果集类型异常:"+resultSet)
@@ -224,7 +276,7 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
//    }
//    }.filter(_.isEmpty).getOrElse(
     executeUntil(sql, { job =>
-      job.getJobRequest match {
+      job.getTask match {
         case t: RequestPersistTask => Array(t.getResultLocation + VisualisUtils.RESULT_FILE_NAME.getValue)
         case _ => Array()
       }
@@ -253,131 +305,14 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
     if(resultSets.isEmpty) new util.ArrayList[util.Map[String, AnyRef]]
     else getResultSet(resultSets(resultSets.length - 1))
   }
 
-  override def submit4Exec(sql: String, pageNo: Int, pageSize: Int, totalCount: Int, limit: Int, excludeColumns: util.Set[String]): PaginateWithExecStatus = {
-    val paginateWithQueryColumns = new PaginateWithExecStatus
-    val execId = submitQuery(sql)
-    paginateWithQueryColumns.setExecId(VisualisUtils.getHAExecId(execId))
-    paginateWithQueryColumns.setProgress(0.0f)
-    return paginateWithQueryColumns;
-  }
-
-  override def getProgress4Exec(execId: String, user: String): PaginateWithExecStatus = {
-    val instance = VisualisUtils.getInstanceByHAExecId(execId)
-    val realExecId = VisualisUtils.getExecIdByHAExecId(execId)
-    if(instance.equals(Sender.getThisInstance)){
-      entranceServer.getJob(realExecId) foreach {
-        case job: VisualisJob =>
-          val paginateWithQueryColumns = new PaginateWithExecStatus
-          paginateWithQueryColumns.setExecId(execId)
-          paginateWithQueryColumns.setProgress(BigDecimal(job.getProgress).setScale(2, RoundingMode.HALF_UP).floatValue())
-          paginateWithQueryColumns.setStatus(job.getState.toString)
-          return paginateWithQueryColumns
-        case _ =>
-          throw new VGErrorException(70001, "executeRealJob failed, not supported job type.")
-      }
-      throw new VGErrorException(70001, s"executeRealJob failed, cannot find the job $execId.")
-    } else {
-      val task = VisualisUtils.getQueryTask(instance, realExecId)
-      val paginateWithQueryColumns = new PaginateWithExecStatus
-      paginateWithQueryColumns.setExecId(execId)
-      paginateWithQueryColumns.setProgress(BigDecimal(task.getProgress).setScale(2, RoundingMode.HALF_UP).floatValue())
-      paginateWithQueryColumns.setStatus(task.getStatus)
-      return paginateWithQueryColumns
-    }
-  }
-
-  override def kill4Exec(execId: String, user: String): PaginateWithExecStatus = {
-    val instance = VisualisUtils.getInstanceByHAExecId(execId)
-    val realExecId = VisualisUtils.getExecIdByHAExecId(execId)
-    if(instance.equals(Sender.getThisInstance)){
-      entranceServer.getJob(realExecId) foreach {
-        case job: VisualisJob =>
-          if(!SchedulerEventState.isCompleted(job.getState)){
-            job.kill();
-          }
-          val paginateWithQueryColumns = new PaginateWithExecStatus
-          paginateWithQueryColumns.setExecId(execId)
-          paginateWithQueryColumns.setProgress(job.getProgress)
-          paginateWithQueryColumns.setStatus(job.getState.toString)
-          return paginateWithQueryColumns
-        case _ =>
-          throw new VGErrorException(70001, "kill job failed, not supported job type.")
-      }
-      throw new VGErrorException(70001, s"kill job failed, cannot find the job $execId.")
-    } else {
-      VisualisUtils.killHA(instance, execId)
-    }
-  }
-
-  override def getResultSet4Exec(execId: String, user: String): PaginateWithExecStatus = {
-    val instance = VisualisUtils.getInstanceByHAExecId(execId)
-    val realExecId = VisualisUtils.getExecIdByHAExecId(execId)
-    val paginateWithQueryColumns = new PaginateWithExecStatus
-    if(instance.equals(Sender.getThisInstance)){
-      entranceServer.getJob(realExecId) foreach {
-        case job: VisualisJob =>
-          job.waitForCompleted()
-
-          //TODO consider if necessary here
-          if (!SchedulerEventState.isSucceed(job.getState)){
-            job.getJobRequest match {
-              case t: RequestPersistTask =>
-                if(t.getErrCode != null && t.getErrDesc != null){
-                  throw SparkEngineExecuteException(t.getErrCode, "spark engine run sql failed:" + t.getErrDesc)
-                }
-              case _ =>
-            }
-            if(job.getErrorResponse != null){
-              throw SparkEngineExecuteException(60001, job.getErrorResponse.message)
-            }
-            throw SparkEngineExecuteException(60001, "spark engine run sql failed")
-          }
-
-          paginateWithQueryColumns.setExecId(execId)
-          paginateWithQueryColumns.setProgress(job.getProgress)
-          paginateWithQueryColumns.setStatus(job.getState.toString)
-          val resultSets = job.getResultSets
-          if(resultSets.isEmpty){
-            paginateWithQueryColumns.setResultList(new util.ArrayList[util.Map[String, AnyRef]])
-            paginateWithQueryColumns.setTotalCount(0)
-          } else {
-            val resultList = getResultSet(resultSets(resultSets.length - 1))
-            paginateWithQueryColumns.setResultList(resultList)
-            paginateWithQueryColumns.setTotalCount(resultList.size())
-            val columns = ResultHelper.getResultType(resultSets(resultSets.length - 1))
-            paginateWithQueryColumns.setColumns(columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList)
-          }
-          return paginateWithQueryColumns;
-        case _ =>
-          //SparkEntranceExecutor.removeJobCache(umUser,execId)
-          throw new VGErrorException(70001, "executeRealJob failed, not supported job type.")
-      }
-      throw new VGErrorException(70001, s"executeRealJob failed, cannot find the job $execId.")
-    } else {
-      val task = VisualisUtils.getQueryTask(instance, realExecId)
-      paginateWithQueryColumns.setExecId(execId)
-      paginateWithQueryColumns.setProgress(task.getProgress)
-      paginateWithQueryColumns.setStatus(task.getStatus)
-      val resultList = getResultSet(task.getResultLocation + VisualisUtils.RESULT_FILE_NAME.getValue)
-      paginateWithQueryColumns.setResultList(resultList)
-      paginateWithQueryColumns.setTotalCount(resultList.size())
-      val columns = ResultHelper.getResultType(task.getResultLocation + VisualisUtils.RESULT_FILE_NAME.getValue)
-      paginateWithQueryColumns.setColumns(columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList)
-      return paginateWithQueryColumns
-    }
-  }
-
   override def query4Paginate(sql: String, pageNo: Int, pageSize: Int, totalCount: Int, limit: Int, excludeColumns: util.Set[String]): PaginateWithQueryColumns = {
     val paginateWithQueryColumns = new PaginateWithQueryColumns
     val resultSets = querySQLWithResultSetPaths(sql, limit)
     if(resultSets.isEmpty){
       paginateWithQueryColumns.setResultList(new util.ArrayList[util.Map[String, AnyRef]])
-      paginateWithQueryColumns.setTotalCount(0)
     } else {
-      val resultList = getResultSet(resultSets(resultSets.length - 1))
-      paginateWithQueryColumns.setResultList(resultList)
-      paginateWithQueryColumns.setTotalCount(resultList.size())
+      paginateWithQueryColumns.setResultList(getResultSet(resultSets(resultSets.length - 1)))
       val columns = ResultHelper.getResultType(resultSets(resultSets.length - 1))
       paginateWithQueryColumns.setColumns(columns.map(col => new QueryColumn(col.columnName,col.dataType.typeName)).toList)
     }
@@ -423,17 +358,17 @@ class SparkEntranceExecutor extends SqlUtils with Logging{
 }
 
 object SparkEntranceExecutor{
-//  val QUERY_JOB_CACHE = new ConcurrentHashMap[String, String]()
-//
-//  def putJobCache(user:String,execId:String): Unit ={
-//    QUERY_JOB_CACHE.put(user,execId)
-//  }
-//
-//  def removeJobCache(user:String,execId:String):Unit = {
-//    QUERY_JOB_CACHE.remove(user, execId)
-//  }
-//
-//  def getJobCache(user:String):String = {
-//    QUERY_JOB_CACHE.get(user)
-//  }
+  val QUERY_JOB_CACHE = new ConcurrentHashMap[String, String]()
+
+  def putJobCache(user:String,execId:String): Unit ={
+    QUERY_JOB_CACHE.put(user,execId)
+  }
+
+  def removeJobCache(user:String,execId:String):Unit = {
+    QUERY_JOB_CACHE.remove(user, execId)
+  }
+
+  def getJobCache(user:String):String = {
+    QUERY_JOB_CACHE.get(user)
+  }
 }
\ No newline at end of file
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SqlCodeParse.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SqlCodeParse.scala
index 39d89041c..ad7aca13a 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SqlCodeParse.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/SqlCodeParse.scala
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.entrance.spark
 
 import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisCSEntranceInterceptor.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisCSEntranceInterceptor.scala
deleted file mode 100644
index 3bc46279d..000000000
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisCSEntranceInterceptor.scala
+++ /dev/null
@@ -1,27 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.entrance.spark
-
-import java.lang
-
-import org.apache.linkis.common.utils.Utils
-import org.apache.linkis.entrance.cs.CSEntranceHelper
-import org.apache.linkis.entrance.interceptor.impl.CSEntranceInterceptor
-import org.apache.linkis.governance.common.entity.job.JobRequest
-import org.apache.linkis.governance.common.entity.task.RequestPersistTask
-import org.apache.linkis.protocol.task.Task
-
-class VisualisCSEntranceInterceptor extends CSEntranceInterceptor {
-
-  override def apply(task: JobRequest, logAppender: lang.StringBuilder): JobRequest = {
-    task match {
-      case jobRequest : JobRequest =>
-        logger.info("Start to execute CSEntranceInterceptor")
-        Utils.tryAndWarn(CSEntranceHelper.addCSVariable(jobRequest))
-        //Utils.tryAndWarn(CSEntranceHelper.resetCreator(requestPersistTask))
-        Utils.tryAndWarn(CSEntranceHelper.initNodeCSInfo(jobRequest))
-        logger.info("Finished to execute CSEntranceInterceptor")
-      case _ =>
-    }
-    task
-  }
-
-}
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisEntranceParser.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisEntranceParser.scala
index c374fd7ba..be1533cd6 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisEntranceParser.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisEntranceParser.scala
@@ -1,22 +1,38 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.entrance.spark
 
-import org.apache.linkis.entrance.parser.CommonEntranceParser
-import org.apache.linkis.governance.common.entity.job.JobRequest
-import org.apache.linkis.governance.common.entity.task.RequestPersistTask
-import org.apache.linkis.protocol.task.Task
-import org.apache.linkis.scheduler.queue.Job
+import com.webank.wedatasphere.linkis.entrance.parser.CommonEntranceParser
+import com.webank.wedatasphere.linkis.protocol.query.RequestPersistTask
+import com.webank.wedatasphere.linkis.protocol.task.Task
+import com.webank.wedatasphere.linkis.scheduler.queue.Job
 import org.springframework.stereotype.Component
-
+/**
+  * Created by shanhuang on 2019/1/23.
+  */
 @Component("entranceParser")
-class VisualisEntranceParser(persistenceManager : VisualisPersistenceManager) extends CommonEntranceParser(persistenceManager) {
+class VisualisEntranceParser extends CommonEntranceParser {
 
-  override def parseToJob(task: JobRequest): Job = {
+  override def parseToJob(task: Task): Job = {
     task match {
-      case requestPersistTask:RequestPersistTask => val job = new VisualisJob(persistenceManager)
-        job.setJobRequest(task)
+      case requestPersistTask:RequestPersistTask => val job = new VisualisJob
+        job.setTask(task)
         job.setUser(requestPersistTask.getUmUser)
         job.setCreator(requestPersistTask.getRequestApplicationName)
-        job.setParams(task.asInstanceOf[RequestPersistTask].getParams.asInstanceOf[java.util.Map[String, Any]])
         job.setEntranceListenerBus(getEntranceContext.getOrCreateEventListenerBus)
         job.setProgress(0.0f)
         job
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisJob.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisJob.scala
index 605de086b..b288b3dfb 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisJob.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisJob.scala
@@ -1,12 +1,30 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.entrance.spark
 
-import org.apache.linkis.entrance.job.EntranceExecutionJob
-import org.apache.linkis.scheduler.queue.SchedulerEventState
-import org.apache.linkis.scheduler.queue.SchedulerEventState.SchedulerEventState
+import com.webank.wedatasphere.linkis.entrance.job.EntranceExecutionJob
+import com.webank.wedatasphere.linkis.scheduler.queue.SchedulerEventState
+import com.webank.wedatasphere.linkis.scheduler.queue.SchedulerEventState.SchedulerEventState
 
 import scala.collection.mutable.ArrayBuffer
-
-class VisualisJob(persistenceManager: VisualisPersistenceManager) extends EntranceExecutionJob(persistenceManager) {
+/**
+  * Created by shanhuang on 2019/1/23.
+  */
+class VisualisJob extends EntranceExecutionJob {
 
   private val resultSets = ArrayBuffer[String]()
 
   def addResultSet(resultSet: String): Unit = resultSets += resultSet
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisPersistenceManager.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisPersistenceManager.scala
index 33373de64..2e31e1b30 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisPersistenceManager.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/entrance/spark/VisualisPersistenceManager.scala
@@ -1,12 +1,30 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.entrance.spark
 
-import org.apache.linkis.entrance.persistence.{PersistenceEngine, QueryPersistenceManager, ResultSetEngine}
-import org.apache.linkis.scheduler.executer.OutputExecuteResponse
-import org.apache.linkis.scheduler.queue.Job
+import com.webank.wedatasphere.linkis.entrance.persistence.{PersistenceEngine, QueryPersistenceManager, ResultSetEngine}
+import com.webank.wedatasphere.linkis.scheduler.executer.OutputExecuteResponse
+import com.webank.wedatasphere.linkis.scheduler.queue.Job
 import javax.annotation.PostConstruct
 import org.springframework.beans.factory.annotation.Autowired
 import org.springframework.stereotype.Component
-
+/**
+  * Created by shanhuang on 2019/1/23.
+  */
 @Component("persistenceManager")
 class VisualisPersistenceManager extends QueryPersistenceManager {
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/exception/ResultTypeException.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/exception/ResultTypeException.scala
index 03e4f2a09..c252a781f 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/exception/ResultTypeException.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/exception/ResultTypeException.scala
@@ -1,6 +1,22 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.exception
 
-import org.apache.linkis.common.exception.ErrorException
+import com.webank.wedatasphere.linkis.common.exception.ErrorException
 
 /**
   * Created by allenlliu on 2019/1/24.
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ModelItem.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ModelItem.scala
index 64d8e1371..dcfda18ea 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ModelItem.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ModelItem.scala
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.res
 
 /**
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ResultHelper.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ResultHelper.scala
index 523c475ab..b741d7415 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ResultHelper.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/res/ResultHelper.scala
@@ -1,16 +1,30 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.res
 
 import java.util
 
-import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig
-import org.apache.linkis.storage.domain._
-import org.apache.linkis.storage.resultset.table.TableMetaData
-import org.apache.linkis.storage.resultset.{ResultSetFactory, ResultSetReader}
+
+import com.webank.wedatasphere.linkis.storage.domain._
+import com.webank.wedatasphere.linkis.storage.resultset.table.TableMetaData
+import com.webank.wedatasphere.linkis.storage.resultset.{ResultSetFactory, ResultSetReader}
 import com.webank.wedatasphere.dss.visualis.exception.VGErrorException
-import org.apache.linkis.adapt.LinkisUtils
-import org.apache.linkis.common.io.FsPath
-import org.apache.linkis.common.utils.Utils
-import org.apache.linkis.server.BDPJettyServerHelper
-import org.apache.linkis.storage.FSFactory
+import com.webank.wedatasphere.linkis.adapt.LinkisUtils
+import com.webank.wedatasphere.linkis.common.io.FsPath
+import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper
 import org.json4s.DefaultFormats
 
 /**
@@ -19,7 +33,7 @@ import org.json4s.DefaultFormats
 object ResultHelper {
   implicit val formats = DefaultFormats
   def getResultByPath(path:String,limit:Long)={
-    val resPath = new FsPath(getSchemaPath(path))
+    val resPath = new FsPath(path)
     val rsFactory = ResultSetFactory.getInstance
     val resultSet = rsFactory.getResultSetByPath(resPath)
     if(ResultSetFactory.TABLE_TYPE != resultSet.resultSetType()){
@@ -27,29 +41,20 @@ object ResultHelper {
     }
   }
 
-  def getSchemaPath(path: String): String = {
-    if(path.startsWith(CommonConfig.RESULT_SET_SCHEMA.getValue)){
-      path
-    } else {
-      CommonConfig.RESULT_SET_SCHEMA.getValue + path
-    }
-  }
-
   @scala.throws[VGErrorException]
   def getResultType(path:String):Array[Column]={
-    val resPath = new FsPath(path)
+    /*val resPath = new FsPath(path)
     val rsFactory = ResultSetFactory.getInstance
     val resultSet = rsFactory.getResultSetByPath(resPath)
     if(ResultSetFactory.TABLE_TYPE != resultSet.resultSetType()){
-      throw new VGErrorException(70001,"不支持不是表格的结果集")
+      throw new VGErrorException(70001,"不支持不是表格的结果集")
     }
     val fs = FSFactory.getFs(resPath)
     fs.init(null)
-    val reader = ResultSetReader.getResultSetReader(resultSet,fs.read(resPath))
+    val reader = ResultSetReader.getResultSetReader(resultSet,fs.read(resPath))*/
+    val reader = ResultSetReader.getTableResultReader(path)
     val metaData = reader.getMetaData.asInstanceOf[TableMetaData]
-    Utils.tryQuietly(reader.close())
-    Utils.tryQuietly(fs.close())
     metaData.columns
   }
 
@@ -73,12 +78,4 @@ object ResultHelper {
       case _ => "string"
     }
 
-  def toVisualType(sqlType: String): String = sqlType match {
-    case "TINYINT" | "SMALLINT" | "MEDIUMINT" | "INT" | "INTEGER" | "BIGINT" | "FLOAT" | "DOUBLE" | "DOUBLE PRECISION" | "REAL" | "DECIMAL" | "BIT" | "SERIAL" | "BOOL" | "BOOLEAN" | "DEC" | "FIXED" | "NUMERIC" => NUMBER_TYPE
-    case "DATE" | "DATETIME" | "TIMESTAMP" | "TIME" | "YEAR" => "date"
-    case _ => "string"
-  }
-
-
-
 }
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisProtocol.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisProtocol.scala
deleted file mode 100644
index bfa2ca8c9..000000000
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisProtocol.scala
+++ /dev/null
@@ -1,5 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.rpc
-
-class VisualisProtocol
-class RequestKillTask(val execId: String) extends VisualisProtocol
-class RequestJDBCResult(val execId: String) extends VisualisProtocol
\ No newline at end of file
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiver.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiver.scala
deleted file mode 100644
index 063491606..000000000
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiver.scala
+++ /dev/null
@@ -1,21 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.rpc
-
-import com.webank.wedatasphere.dss.visualis.query.utils.JdbcAsyncUtils
-import org.apache.linkis.rpc.{Receiver, Sender}
-import edp.core.utils.SqlUtils
-
-import scala.concurrent.duration.Duration
-
-class VisualisReceiver(sqlUtils: SqlUtils) extends Receiver {
-
-  override def receive(message: Any, sender: Sender): Unit = {}
-
-  override def receiveAndReply(message: Any, sender: Sender): Any = message match {
-    case k: RequestKillTask =>
-      sqlUtils.kill4Exec(k.execId, null)
-    case j: RequestJDBCResult =>
-      JdbcAsyncUtils.getResult(j.execId)
-  }
-
-  override def receiveAndReply(message: Any, duration: Duration, sender: Sender): Any = {}
-}
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiverChooser.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiverChooser.scala
deleted file mode 100644
index d2a3214da..000000000
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/rpc/VisualisReceiverChooser.scala
+++ /dev/null
@@ -1,23 +0,0 @@
-package com.webank.wedatasphere.dss.visualis.rpc
-
-import org.apache.linkis.rpc.{RPCMessageEvent, Receiver, ReceiverChooser}
-import edp.core.utils.SqlUtils
-import javax.annotation.PostConstruct
-import org.springframework.beans.factory.annotation.Autowired
-import org.springframework.stereotype.Component
-
-@Component
-class VisualisReceiverChooser extends ReceiverChooser {
-
-  @Autowired
-  private var sqlUtils: SqlUtils = _
-  private var receiver: Option[VisualisReceiver] = _
-
-  @PostConstruct
-  def init(): Unit = receiver = Some(new VisualisReceiver(sqlUtils))
-
-  override def chooseReceiver(event: RPCMessageEvent): Option[Receiver] = event.message match {
-    case _: VisualisProtocol => receiver
-    case _ => None
-  }
-}
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/ujes/UJESJob.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/ujes/UJESJob.scala
index 02272d661..380efec96 100644
--- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/ujes/UJESJob.scala
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/ujes/UJESJob.scala
@@ -1,49 +1,29 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
 package com.webank.wedatasphere.dss.visualis.ujes
 
-import com.webank.wedatasphere.dss.visualis.utils.VisualisUtils
+import com.webank.wedatasphere.linkis.server.JMap
 
 /**
   * Created by johnnwang on 2019/1/26.
   */
-case class UJESJob(var code:String,
-                   var user:String,
-                   var jobType:String,
-                   var source:Any,
-                   var creator: String = VisualisUtils.VG_CREATOR.getValue,
-                   var engine: String = VisualisUtils.SPARK.getValue,
-                   var nodeName: String = "",
-                   var contextId: String = "",
-                   var cache: Boolean = false,
-                   var cacheExpireAfter : Long = 300L,
-                   var readFromCache: Boolean = false,
-                   var readCacheBefore: Long = 300L
-                  )
+case class UJESJob(code:String,user:String,jobType:String, source:Any)
 
 object UJESJob{
   val SQL_TYPE = "sql"
-  val PSQL_TYPE = "psql"
   val SCALA_TYPE ="scala"
-  val SPARK_ENGINE ="spark"
-  val PRESTO_ENGINE ="presto"
-  val CONTEXT_ID ="contextId"
-
-  def apply(code: String,
-            user: String,
-            jobType: String,
-            source: Any): UJESJob = new UJESJob(code, user, jobType, source)
-  def apply(code: String,
-            user: String,
-            jobType: String,
-            source: Any,
-            cache: Boolean,
-            cacheExpireAfter : Long,
-            readFromCache: Boolean,
-            readCacheBefore: Long): UJESJob = {
-    val ujesJob = new UJESJob(code, user, jobType, source)
-    ujesJob.cache = cache
-    ujesJob.cacheExpireAfter = cacheExpireAfter
-    ujesJob.readFromCache = readFromCache
-    ujesJob.readCacheBefore = readCacheBefore
-    ujesJob
-  }
 }
diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/UserConfiguration.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/UserConfiguration.scala
new file mode 100644
index 000000000..28547cb6e
--- /dev/null
+++ b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/UserConfiguration.scala
@@ -0,0 +1,45 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.visualis.utils + +import java.util + +import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils} +import com.webank.wedatasphere.linkis.protocol.CacheableProtocol +import com.webank.wedatasphere.linkis.protocol.config.{RequestQueryAppConfigWithGlobal, ResponseQueryConfig} +import com.webank.wedatasphere.linkis.rpc.RPCMapCache + +/** + * Created by johnnwang on 2018/12/11. + */ +object UserConfiguration extends + RPCMapCache[RequestQueryAppConfigWithGlobal, String, String](VisualisUtils.CLOUD_CONSOLE_CONFIGURATION_SPRING_APPLICATION_NAME.getValue) with Logging{ + + override protected def createRequest(req: RequestQueryAppConfigWithGlobal): CacheableProtocol = { + req + } + + override protected def createMap(any: Any): util.Map[String, String] = any match { + case response: ResponseQueryConfig => response.getKeyAndValue + } + + override def getCacheMap(key: RequestQueryAppConfigWithGlobal): util.Map[String, String] = { + Utils.tryCatch(super.getCacheMap(key)){case error:Throwable => warn(s"Failed to get Configuration:$key ", error) + null + } + } +} diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisSpringConfiguration.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisSpringConfiguration.scala index b113bfa49..3056400d0 100644 --- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisSpringConfiguration.scala +++ 
b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisSpringConfiguration.scala @@ -1,35 +1,35 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ package com.webank.wedatasphere.dss.visualis.utils -import com.webank.wedatasphere.dss.visualis.entrance.spark.{VisualisCSEntranceInterceptor, VisualisEntranceParser} -import org.apache.linkis.entrance.EntranceParser -import org.apache.linkis.entrance.annotation._ -import org.apache.linkis.entrance.conf.EntranceConfiguration.ENTRANCE_SCHEDULER_MAX_PARALLELISM_USERS -import org.apache.linkis.entrance.execute.impl.EntranceExecutorManagerImpl -import org.apache.linkis.entrance.execute._ -import org.apache.linkis.entrance.interceptor.EntranceInterceptor -import org.apache.linkis.entrance.interceptor.impl._ -import org.apache.linkis.entrance.persistence.PersistenceManager -import org.apache.linkis.entrance.scheduler.EntranceSchedulerContext -import org.apache.linkis.entrance.scheduler.cache.ReadCacheConsumerManager -import org.apache.linkis.scheduler.SchedulerContext -import org.apache.linkis.scheduler.executer.ExecutorManager -import org.apache.linkis.scheduler.queue.{ConsumerManager, GroupFactory} -import org.springframework.beans.factory.annotation.Autowired +import com.webank.wedatasphere.dss.visualis.entrance.spark.VisualisEntranceParser +import com.webank.wedatasphere.linkis.entrance.EntranceParser import 
org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean import org.springframework.context.annotation.{Bean, Configuration, Primary} - +/** + * Created by shanhuang on 2019/1/23. + */ @Configuration class VisualisSpringConfiguration { @Primary - @EntranceInterceptorBeanAnnotation - def generateEntranceInterceptors: Array[EntranceInterceptor] = Array[EntranceInterceptor](new VisualisCSEntranceInterceptor, new ShellDangerousGrammerInterceptor, new PythonCodeCheckInterceptor, new DBInfoCompleteInterceptor, new SparkCodeCheckInterceptor, new SQLCodeCheckInterceptor, new VarSubstitutionInterceptor, new LogPathCreateInterceptor, new StorePathEntranceInterceptor, new ScalaCodeInterceptor, new SQLLimitEntranceInterceptor, new CommentInterceptor, new PythonCodeCheckInterceptor) - - @Primary - @ConsumerManagerBeanAnnotation - def generateConsumerManager(@PersistenceManagerBeanAnnotation.PersistenceManagerAutowiredAnnotation persistenceManager: PersistenceManager) = new ReadCacheConsumerManager(ENTRANCE_SCHEDULER_MAX_PARALLELISM_USERS.getValue, persistenceManager) + @Bean(Array("entranceParser")) + @ConditionalOnMissingBean + def createEntranceParser(): EntranceParser = new VisualisEntranceParser - @SchedulerContextBeanAnnotation - def generateSchedulerContext(@GroupFactoryBeanAnnotation.GroupFactoryAutowiredAnnotation groupFactory: GroupFactory, @EntranceExecutorManagerBeanAnnotation.EntranceExecutorManagerAutowiredAnnotation executorManager: ExecutorManager, @ConsumerManagerBeanAnnotation.ConsumerManagerAutowiredAnnotation consumerManager: ConsumerManager) = new EntranceSchedulerContext(groupFactory, consumerManager, executorManager) } diff --git a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisUtils.scala b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisUtils.scala index e28a5f03b..62a9037b6 100644 --- a/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisUtils.scala +++ 
b/server/src/main/scala/com/webank/wedatasphere/dss/visualis/utils/VisualisUtils.scala @@ -1,79 +1,53 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ package com.webank.wedatasphere.dss.visualis.utils -import java.util.Date -import com.google.common.collect.Lists -import com.webank.wedatasphere.dss.visualis.model.{DWCResultInfo, PaginateWithExecStatus} -import com.webank.wedatasphere.dss.visualis.rpc.{RequestJDBCResult, RequestKillTask} -import org.apache.linkis.common.ServiceInstance -import org.apache.linkis.common.conf.CommonVars -import org.apache.linkis.entrance.conf.EntranceConfiguration -import org.apache.linkis.governance.common.entity.task.{RequestOneTask, RequestPersistTask} -import org.apache.linkis.manager.label.entity.engine.RunType -import org.apache.linkis.protocol.query.cache.RequestDeleteCache -import org.apache.linkis.rpc.Sender -import org.apache.linkis.ujes.client.response.JobInfoResult -import edp.core.model.{BaseSource, PaginateWithQueryColumns} -import edp.davinci.model.View -import org.apache.linkis.adapt.LinkisUtils - +import java.awt.Image +import java.awt.image.BufferedImage +import java.io.File +import java.util + +import com.webank.wedatasphere.linkis.common.conf.CommonVars +import com.webank.wedatasphere.dss.visualis.configuration.CommonConfig +import com.webank.wedatasphere.dss.visualis.model.DWCResultInfo +import com.webank.wedatasphere.linkis.adapt.LinkisUtils +import 
edp.core.model.BaseSource +import edp.davinci.dao.ProjectMapper +import edp.davinci.model.{Project, View} +import javax.imageio.ImageIO + +/** + * Created by johnnwang on 2019/1/23. + */ object VisualisUtils { - val sender = Sender.getSender(EntranceConfiguration.QUERY_PERSISTENCE_SPRING_APPLICATION_NAME.getValue) - val HIVE_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.hive.datasource.token","hiveDataSource-token") val HIVE_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.hive.datasource.id",1) - val PRESTO_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.presto.datasource.token","prestoDataSource-token") - val PRESTO_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.presto.datasource.id",210) val TMP_VIEW_NAME = CommonVars("wds.dss.visualis.view.tmp.name","tmpView") val DEFAULT_PROJECT_NAME = CommonVars("wds.dss.visualis.project.name","Public Project") val DEFAULT_PROJECT_ID = CommonVars[Long]("wds.dss.visualis.project.id",-1) val DWC_RESULT_INFO = CommonVars("wds.dss.visualis.result.info","dwcResultInfo") + val BDP_DWC_INSTANCE = CommonVars[Integer]("wds.dss.instance",1) + val BDP_DWC_VG_QUERY_TIMEOUT = CommonVars[Long]("wds.dss.visualis.query.timeout",1000 * 60 * 10) + val CLOUD_CONSOLE_CONFIGURATION_SPRING_APPLICATION_NAME = CommonVars("wds.dss.visualis.conf.application.name", "cloud-configuration") val VG_CREATOR = CommonVars("wds.dss.visualis.creator", "Visualis") - val SPARK = CommonVars("wds.dss.visualis.spark.name", "spark") - val PRESTO = CommonVars("wds.dss.visualis.presto.name", "presto") - val RESULT_FILE_NAME = CommonVars("wds.dss.visualis.result.file.name", "/_0.dolphin") - val SCALA_RESULT_FILE_NAME = CommonVars("wds.dss.visualis.result.file.name", "/_1.dolphin") - val HA_EXEC_ID_SEPARATOR = CommonVars("wds.dss.visualis.ha.exec.id.separator", "@") - val TASK_SEARCH_TIME = CommonVars("wds.dss.visualis.task.search.time", 1000*60*60*24L) - val AVAILABLE_ENGINE_TYPES = CommonVars("wds.dss.visualis.available.engine.types", "spark;presto") - - def 
getDataSourceName(engineType: String) = { - if(SPARK.getValue.equals(engineType)){ - "hive" - } else { - PRESTO.getValue - } - } - - def getCreator(engineType: String) = { - if(SPARK.getValue.equals(engineType)){ - VG_CREATOR.getValue - } else { - "IDE" - } - } - - def isLinkisDataSource(source:BaseSource):Boolean ={ - isHiveDataSource(source) || isPrestoDataSource(source) - } - - def isPrestoDataSource(source:BaseSource):Boolean ={ - val prestoDataSourceToken = PRESTO_DATA_SOURCE_TOKEN.getValue - prestoDataSourceToken.equals(source.getUsername) - } - - def getPrestoDataSourceId():Long = { - PRESTO_DATA_SOURCE_ID.getValue - } - - def getResultSetPath(jobInfo: JobInfoResult) = { - if(RunType.SCALA.toString.equalsIgnoreCase(jobInfo.getRequestPersistTask.getRunType)){ - jobInfo.getRequestPersistTask.getResultLocation + VisualisUtils.SCALA_RESULT_FILE_NAME.getValue - } else { - jobInfo.getRequestPersistTask.getResultLocation + VisualisUtils.RESULT_FILE_NAME.getValue - } - } + val VG_APP_NAME = CommonVars("wds.dss.visualis.app.name", "spark") + val RESULT_FILE_NAME = CommonVars("wds.dss.visualis.result.file.name", "/_0.dolphin") def isHiveDataSource(source:BaseSource):Boolean ={ val hiveDataSourceToken = HIVE_DATA_SOURCE_TOKEN.getValue @@ -99,52 +73,8 @@ object VisualisUtils { scala } - def buildScala(sql: String, resultLocation: String, tableName:String):String ={ - val scala: String = "val sql = \"\"\" " + sql +"\"\"\"" + s""" \norg.apache.spark.sql.execution.datasources.csv.DolphinToSpark.createTempView""" + - s"""(spark,"$tableName","${resultLocation}", true);show(spark.sql(sql))""" - scala - } - def getUserTicketKV(username: String): (String, String) = { LinkisUtils.getUserTicketKV(username) } - def deleteCache(executionCode: String, user: String) = { - sender.ask(new RequestDeleteCache(executionCode, SPARK.getValue, Lists.newArrayList(user))) - } - - def getHAExecId(execId: String) : String = { - execId + HA_EXEC_ID_SEPARATOR.getValue + Sender.getThisInstance - }
- - def getInstanceByHAExecId(haExecId: String) : String = { - haExecId.split(HA_EXEC_ID_SEPARATOR.getValue)(1) - } - - def getExecIdByHAExecId(haExecId: String) : String = { - haExecId.split(HA_EXEC_ID_SEPARATOR.getValue)(0) - } - - def getQueryTask(instance: String, execId: String) : RequestPersistTask = { - val requestOneTask = new RequestOneTask - requestOneTask.setInstance(instance) - requestOneTask.setExecId(VisualisUtils.getExecIdByHAExecId(execId)) - requestOneTask.setExecuteApplicationName(VisualisUtils.SPARK.getValue) - val currentTimeMillis = System.currentTimeMillis() - requestOneTask.setStartTime(new Date(currentTimeMillis - TASK_SEARCH_TIME.getValue)) - requestOneTask.setEndTime(new Date(currentTimeMillis)) - sender.ask(requestOneTask).asInstanceOf[RequestPersistTask] - } - - def killHA(instance: String, execId: String) : PaginateWithExecStatus = { - val requestKillTask = new RequestKillTask(execId) - Sender.getSender(ServiceInstance(Sender.getThisServiceInstance.getApplicationName, instance)) - .ask(requestKillTask).asInstanceOf[PaginateWithExecStatus] - } - - def getJDBCResult(instance: String, execId: String) : PaginateWithQueryColumns = { - Sender.getSender(ServiceInstance(Sender.getThisServiceInstance.getApplicationName, instance)) - .ask(new RequestJDBCResult(execId)).asInstanceOf[PaginateWithQueryColumns] - } - } diff --git a/server/src/main/scala/com/webank/wedatasphere/linkis/adapt/ComponentRegister.scala b/server/src/main/scala/com/webank/wedatasphere/linkis/adapt/ComponentRegister.scala new file mode 100644 index 000000000..08680610f --- /dev/null +++ b/server/src/main/scala/com/webank/wedatasphere/linkis/adapt/ComponentRegister.scala @@ -0,0 +1,28 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.linkis.adapt + +import org.springframework.context.annotation.ComponentScan +import org.springframework.stereotype.Component +/** + * Created by shanhuang on 2019/1/23. + */ +@Component +@ComponentScan(basePackages = Array("edp", "com.webank.wedatasphere.dss")) +class ComponentRegister { + +} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java b/server/src/main/scala/com/webank/wedatasphere/linkis/adapt/LinkisUtils.scala similarity index 50% rename from visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java rename to server/src/main/scala/com/webank/wedatasphere/linkis/adapt/LinkisUtils.scala index f0c31dc8b..78810659f 100644 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java +++ b/server/src/main/scala/com/webank/wedatasphere/linkis/adapt/LinkisUtils.scala @@ -1,7 +1,8 @@ /* * Copyright 2019 WeBank + * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -13,17 +14,19 @@ * limitations under the License. 
* */ +package com.webank.wedatasphere.linkis.adapt -package com.webank.wedatasphere.dss.appconn.visualis.service; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefQueryOperation; -import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefQueryService; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper +import com.webank.wedatasphere.linkis.server.security.SSOUtils +/** + * Created by shanhuang on 2019/1/23. + */ +object LinkisUtils { -public class VisualisQueryService extends AbstractRefQueryService { + val gson = BDPJettyServerHelper.gson - @Override - public VisualisRefQueryOperation createRefQueryOperation() { - return new VisualisRefQueryOperation(); - } + def getUserTicketKV(username :String) = { + SSOUtils.getUserTicketKV(username) + } } diff --git a/server/src/main/scala/org/apache/linkis/adapt/ComponentRegister.scala b/server/src/main/scala/org/apache/linkis/adapt/ComponentRegister.scala deleted file mode 100644 index 5ecc06d54..000000000 --- a/server/src/main/scala/org/apache/linkis/adapt/ComponentRegister.scala +++ /dev/null @@ -1,10 +0,0 @@ -package org.apache.linkis.adapt - -import org.springframework.context.annotation.ComponentScan -import org.springframework.stereotype.Component - -@Component -@ComponentScan(basePackages = Array("edp", "com.webank.wedatasphere.dss")) -class ComponentRegister { - -} diff --git a/server/src/main/scala/org/apache/linkis/adapt/LinkisUtils.scala b/server/src/main/scala/org/apache/linkis/adapt/LinkisUtils.scala deleted file mode 100644 index 348c77410..000000000 --- a/server/src/main/scala/org/apache/linkis/adapt/LinkisUtils.scala +++ /dev/null @@ -1,28 +0,0 @@ -package org.apache.linkis.adapt - -import com.google.gson._ -import org.apache.linkis.server.BDPJettyServerHelper -import org.apache.linkis.server.security.SSOUtils - -import java.lang -import java.lang.reflect.Type - -/** - * @author: jinyangrao on 2022/3/16 - * @description: - */ -object LinkisUtils 
{ - - val gson = BDPJettyServerHelper.gson - - val gsonNoContert = new GsonBuilder().disableHtmlEscaping().setDateFormat("yyyy-MM-dd'T'HH:mm:ssZ").serializeNulls - .registerTypeAdapter(classOf[java.lang.Double], new JsonSerializer[java.lang.Double] { - override def serialize(t: lang.Double, `type`: Type, jsonSerializationContext: JsonSerializationContext): JsonElement = - if (t == t.longValue()) new JsonPrimitive(t.longValue()) else new JsonPrimitive(t) - }).create - - def getUserTicketKV(username: String) = { - SSOUtils.getUserTicketKV(username) - } - -} diff --git a/server/src/test/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigrationTest.java b/server/src/test/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigrationTest.java deleted file mode 100644 index 67b57b433..000000000 --- a/server/src/test/java/com/webank/wedatasphere/dss/visualis/utils/export/WidgetMigrationTest.java +++ /dev/null @@ -1,16 +0,0 @@ -package com.webank.wedatasphere.dss.visualis.utils.export; - -import org.apache.commons.lang.StringUtils; - -public class WidgetMigrationTest { - - public static void main(String[] args) throws Exception { -// String oldConfig = "{\"pageSize\":\"500\",\"cols\":[\"id\",\"name\",\"sex\",\"class\",\"lesson\",\"city\",\"teacher\",\"birthday\"],\"rows\":[],\"metrics\":[{\"name\":\"age@davinci@F5269200\",\"type\":\"value\",\"visualType\":\"string\",\"agg\":\"sum\",\"config\":true,\"chart\":{\"id\":1,\"name\":\"pivot\",\"title\":\"透视表\",\"icon\":\"icon-table\",\"coordinate\":\"cartesian\",\"requireDimetions\":[0,9999],\"requireMetrics\":[0,9999],\"data\":{\"color\":{\"title\":\"颜色\",\"type\":\"category\"}},\"style\":{\"pivot\":{\"fontFamily\":\"PingFang 
SC\",\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\",\"headerBackgroundColor\":\"#f7f7f7\"}}}},{\"name\":\"score@davinci@893072DB\",\"type\":\"value\",\"visualType\":\"string\",\"agg\":\"sum\",\"config\":true,\"chart\":{\"id\":1,\"name\":\"pivot\",\"title\":\"透视表\",\"icon\":\"icon-table\",\"coordinate\":\"cartesian\",\"requireDimetions\":[0,9999],\"requireMetrics\":[0,9999],\"data\":{\"color\":{\"title\":\"颜色\",\"type\":\"category\"}},\"style\":{\"pivot\":{\"fontFamily\":\"PingFang SC\",\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\",\"headerBackgroundColor\":\"#f7f7f7\"}}}},{\"name\":\"fee@davinci@A5BB2EC4\",\"type\":\"value\",\"visualType\":\"string\",\"agg\":\"sum\",\"config\":true,\"chart\":{\"id\":1,\"name\":\"pivot\",\"title\":\"透视表\",\"icon\":\"icon-table\",\"coordinate\":\"cartesian\",\"requireDimetions\":[0,9999],\"requireMetrics\":[0,9999],\"data\":{\"color\":{\"title\":\"颜色\",\"type\":\"category\"}},\"style\":{\"pivot\":{\"fontFamily\":\"PingFang SC\",\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\",\"headerBackgroundColor\":\"#f7f7f7\"}}}}],\"filters\":[],\"chartStyles\":{\"table\":{\"fontFamily\":\"PingFang 
SC\",\"fontSize\":\"12\",\"color\":\"#666\",\"lineStyle\":\"solid\",\"lineColor\":\"#D9D9D9\",\"headerBackgroundColor\":\"#f7f7f7\"}},\"selectedChart\":1,\"data\":[],\"renderType\":\"rerender\",\"orders\":[],\"mode\":\"chart\",\"model\":{\"id\":{\"sqlType\":\"int\",\"visualType\":\"string\",\"modelType\":\"category\"},\"name\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"sex\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"age\":{\"sqlType\":\"int\",\"visualType\":\"string\",\"modelType\":\"value\"},\"class\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"lesson\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"city\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"teacher\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"score\":{\"sqlType\":\"double\",\"visualType\":\"string\",\"modelType\":\"value\"},\"fee\":{\"sqlType\":\"double\",\"visualType\":\"string\",\"modelType\":\"value\"},\"birthday\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"},\"exam_date\":{\"sqlType\":\"string\",\"visualType\":\"string\",\"modelType\":\"category\"}},\"queryParams\":[],\"cache\":false,\"expired\":300,\"columnsWidth\":{\"id\":168.75,\"name\":168.75,\"sex\":168.75,\"class\":168.75,\"lesson\":168.75,\"city\":168.75,\"teacher\":168.75,\"birthday\":168.75,\"sum(age)\":122.72727272727273,\"sum(score)\":122.72727272727273,\"sum(fee)\":122.72727272727273},\"columnsShowAsPercent\":{}}"; -// String newConfig = WidgetMigration.migrate(oldConfig, 297L); -// System.out.println(newConfig); - String a = 
"ffffff029aa07098c790192de951383f8a076a467701a8dbcf8704d9f098d8c2c94cd610c2a4a0b9bd9303b6d2379cc0101700d156512d638e02b847d517902b3a94573a6bd053ccf7d522ee6312603e3eaec01aa4bb4910051c5f2efa0c53ac36491f20c121a3170193210e3e20823628de9dd5d42e67d51596a5a0c4995961a6f253275793e56ac3862eaa9d5afd26fb308a4f0f7d48209ec7c585e23a37dd8ec9e90512b65b299017a6a44d21f37422c880c8b3c795225341e374fbf9a63acadd4b39cf726c1f04bd59a1bc96955e3f100dad1b3be21f3f13894eb240aec7858997beb2ffe20ba614533734896a52ab9e7455377c4186b9794e19397f76a0c4732baa@MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQC8G6Vz79UNIq7AxR22tnW9ib0VTrgTiXkQ5DkujN1FmgDG/WeXAQwpblysYcEBPeuZ044QejDruvnnONeSzqYwPK5T7LfZNaymFHqFpQyxFx94FanmHdyxAFpoNPZ2Lakv6vExe+Gh0Cgj1TVtJ/hTmryZwmcKWfW3A5STLcEJLxUf3lBrAdi9N77U0g0v5okbdbiD0l8TCgU+cM5AcYWvKi00yzXNkG3c8PI2wys+WgFUsVVQBPtGmZeQSaN+mp/MO907gX0xIFqfqyTaNDJYlSDgUpeCyp5dTP45ils1bwPDRDkTkkMYy2vHWTNfyZWiiF6hLtKmI8myCLboU9ITAgMBAAECggEANxpqJ0I0SPrF8lZL1AAzEWjN6PX8WkzFGDuivI4rK35nh+Mne0alR2W65Axmu3RmFdOxJAaHWiaVmjQ+ghTi/fJoptELMifVAXmyQoAM7bt2TnkaIfzRb1BJK4mIQSozC4RpTzOY7wvJFmYYlndE+Ui0wt39zTx5DDmSRmL6zzNoTG5pPgmN/zq2icbhXqD0DP8wxw4AFKJWdrcMkkjRfKkByqA03bymfQqIz5uHril60o3xuuTyBPR74bnPdJE4ONahQHIgvWj/aQYqNyaapJJ8C194Acin0hl1QRv30+syM95QBdLEbLPAUHo2ClWRJZ6cqDPZe2a2N5YpbXtIQQKBgQD0iremr8O42yENqfAXK06Cek30E6wHTJ/RYipcveVg34rXsJ+yTls7SV6tyCqNknZzMoRw\" + 
\"QrIs74TN+KDF52wudrTpbsf5IKJQyExOz7LxTJ76h2OVA9zM1/MPtOHLt5mnwhLTtmxhVZ54CXbkw237pSagG+HhLyrO8S4mIweH8QKBgQDE7AEClojuj5cwRH46ic2s/oIuBObNFeJcRvxx+ONNdlOWOKRi6FhfHlhzLoFDUci2bjn1fvP1EMYZ+KkXATyezgIjJnnClXFpsORhUWh0SiqS3gVJeSIEDKeuh9esRPXk/cyPa3V8o5HouWWDitday0Xsnw51/sVTbN3b5z0eQwKBgEys9hqgv+jFZJ7JKwvIu2wz9x9Rz73WK8JWWlwL+tEeJoWsztX0taxoO/SXb6hGRTennlkowH9Qdr6yd462GniTJfSPlMorjllwBGUtwLjiQnLhYrsFpATirUa+e5IJtncgZhDWATOfyflvVkUyddjSlsLbGz8lL/IFM2gn0aOxAoGADJlQ4zqAXkr/kE4BiXtBlnzeFVWo8pwg1GiSRDR5Tn5wkJ7lHZLh/IvzesMR8B2uasWYnbVWpGpDUmwPXXJtz3c8ucT/a0ymae2wXu2XckFAgg8EZZQDciDhJZB5YwMyfEkkqlRkuum4LxyVexoJ9zwkKCRxB2madGD1vNkJlwMCgYANF7fiG6k3D45Nopu8iTbi3S9oOnhTWxpwxSJWTUij4HFmtSXjJfgOPG9rVvO5QCaHWDWHE/LyZ/Y51ustAV3uj5UmnGwXQDNEgNZUFe4vwzYq7ikXoE6zCTzs70DT/4llos5g1rs8feuWrJK19DPKrenxyOLI6pPA9GjMgC1aEg=="; - String[] b = StringUtils.split(a,"@"); - b.toString(); - - } -} diff --git a/visualis-appconn/pom.xml b/visualis-appconn/pom.xml deleted file mode 100644 index f80592d5f..000000000 --- a/visualis-appconn/pom.xml +++ /dev/null @@ -1,216 +0,0 @@ - - - - - - visualis - com.webank.wedatasphere.dss - 1.0.0 - ../pom.xml - - 4.0.0 - - dss-visualis-appconn - - - - com.webank.wedatasphere.dss - dss-project-plugin - ${opensource.dss.version} - - - com.webank.wedatasphere.dss - dss-appconn-core - ${opensource.dss.version} - - - com.webank.wedatasphere.dss - spring-origin-dss-project-plugin - ${opensource.dss.version} - - - - com.webank.wedatasphere.dss - dss-structure-integration-standard - ${opensource.dss.version} - - - - com.webank.wedatasphere.dss - dss-development-process-standard - ${opensource.dss.version} - - - com.webank.wedatasphere.dss - dss-development-process-standard-execution - ${opensource.dss.version} - - - - org.apache.linkis - linkis-module - ${apache.linkis.version} - provided - - - httpclient - org.apache.httpcomponents - - - true - - - org.apache.linkis - linkis-cs-common - ${apache.linkis.version} - compile - - - linkis-bml-client - - - gson - 
com.google.code.gson - - - org.apache.linkis - ${apache.linkis.version} - provided - true - - - - org.apache.linkis - linkis-httpclient - ${apache.linkis.version} - - - linkis-common - org.apache.linkis - - - json4s-jackson_2.11 - org.json4s - - - - - - org.apache.linkis - linkis-storage - ${apache.linkis.version} - provided - - - linkis-common - org.apache.linkis - - - - - - com.webank.wedatasphere.dss - dss-common - ${opensource.dss.version} - provided - - - - org.apache.linkis - linkis-cs-client - ${apache.linkis.version} - compile - - - - - - - - - org.apache.maven.plugins - maven-deploy-plugin - - - - net.alchim31.maven - scala-maven-plugin - - - org.apache.maven.plugins - maven-jar-plugin - - - org.apache.maven.plugins - maven-assembly-plugin - 2.3 - false - - - make-assembly - package - - single - - - - src/main/assembly/distribution.xml - - - - - - false - visualis - false - false - - src/main/assembly/distribution.xml - - - - - org.apache.maven.plugins - maven-gpg-plugin - - true - - - - - - src/main/java - - **/*.xml - - - - src/main/resources - - **/*.properties - **/application.yml - **/bootstrap.yml - **/log4j2.xml - - - - - - - \ No newline at end of file diff --git a/visualis-appconn/src/main/assembly/distribution.xml b/visualis-appconn/src/main/assembly/distribution.xml deleted file mode 100644 index 720ed5338..000000000 --- a/visualis-appconn/src/main/assembly/distribution.xml +++ /dev/null @@ -1,80 +0,0 @@ - - - - dss-visualis-appconn - - zip - - true - visualis - - - - - - lib - true - true - false - true - true - - - - - - ${basedir}/conf - - * - - 0777 - conf - unix - - - . 
- - */** - - logs - - - - ${basedir}/src/main/resources - - init.sql - - 0777 - db - - - - ${basedir}/src/main/icons - - * - - 0777 - icons - - - - - - diff --git a/visualis-appconn/src/main/icons/dashboard.icon b/visualis-appconn/src/main/icons/dashboard.icon deleted file mode 100644 index 3035bc360..000000000 --- a/visualis-appconn/src/main/icons/dashboard.icon +++ /dev/null @@ -1 +0,0 @@ -dashboardCreated with Sketch. \ No newline at end of file diff --git a/visualis-appconn/src/main/icons/display.icon b/visualis-appconn/src/main/icons/display.icon deleted file mode 100644 index d7d295aa2..000000000 --- a/visualis-appconn/src/main/icons/display.icon +++ /dev/null @@ -1 +0,0 @@ -displayCreated with Sketch. \ No newline at end of file diff --git a/visualis-appconn/src/main/icons/view.icon b/visualis-appconn/src/main/icons/view.icon deleted file mode 100644 index 4f786027e..000000000 --- a/visualis-appconn/src/main/icons/view.icon +++ /dev/null @@ -1 +0,0 @@ -displayCreated with Sketch. \ No newline at end of file diff --git a/visualis-appconn/src/main/icons/widget.icon b/visualis-appconn/src/main/icons/widget.icon deleted file mode 100644 index fad965652..000000000 --- a/visualis-appconn/src/main/icons/widget.icon +++ /dev/null @@ -1 +0,0 @@ -widgetCreated with Sketch. \ No newline at end of file diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java deleted file mode 100644 index 7d0711301..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java +++ /dev/null @@ -1,48 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis; - -import com.webank.wedatasphere.dss.appconn.core.ext.ThirdlyAppConn; -import com.webank.wedatasphere.dss.appconn.core.impl.AbstractOnlySSOAppConn; -import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; -import com.webank.wedatasphere.dss.standard.app.structure.StructureIntegrationStandard; -import org.apache.linkis.common.conf.CommonVars; - -public class VisualisAppConn extends AbstractOnlySSOAppConn implements ThirdlyAppConn { - - public static final String VISUALIS_APPCONN_NAME = CommonVars.apply("wds.dss.appconn.visualis.name", "Visualis").getValue(); - - private VisualisDevelopmentIntegrationStandard developmentIntegrationStandard; - private VisualisStructureIntegrationStandard structureIntegrationStandard; - - @Override - protected void initialize() { - structureIntegrationStandard = new VisualisStructureIntegrationStandard(); - developmentIntegrationStandard = new VisualisDevelopmentIntegrationStandard(); - } - - @Override - public StructureIntegrationStandard getOrCreateStructureStandard() { - return structureIntegrationStandard; - } - - @Override - public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { - return developmentIntegrationStandard; - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java deleted 
file mode 100644 index 3a9503d94..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.OperationStrategyFactory; -import com.webank.wedatasphere.dss.appconn.visualis.service.VisualisExecutionService; -import com.webank.wedatasphere.dss.appconn.visualis.service.*; -import com.webank.wedatasphere.dss.standard.app.development.service.*; -import com.webank.wedatasphere.dss.standard.app.development.standard.AbstractDevelopmentIntegrationStandard; -import com.webank.wedatasphere.dss.standard.common.exception.AppStandardErrorException; - -public class VisualisDevelopmentIntegrationStandard extends AbstractDevelopmentIntegrationStandard { - - @Override - public void init() throws AppStandardErrorException { - OperationStrategyFactory.setSsoRequestOperation(ssoRequestService.createSSORequestOperation(VisualisAppConn.VISUALIS_APPCONN_NAME)); - super.init(); - } - - @Override - protected RefCRUDService createRefCRUDService() { - return new VisualisCRUDService(); - } - - @Override - protected RefExecutionService createRefExecutionService() { - return new VisualisExecutionService(); - } - - @Override - protected RefExportService createRefExportService() { - 
return new VisualisRefExportService(); - } - - @Override - protected RefImportService createRefImportService() { - return new VisualisRefImportService(); - } - - @Override - protected RefQueryService createRefQueryService() { - return new VisualisQueryService(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java deleted file mode 100644 index c35fb1357..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java +++ /dev/null @@ -1,30 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis; - -import com.webank.wedatasphere.dss.appconn.visualis.project.VisualisProjectService; -import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureIntegrationStandard; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; - - -public class VisualisStructureIntegrationStandard extends AbstractStructureIntegrationStandard { - - @Override - protected ProjectService createProjectService() { - return new VisualisProjectService(); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/constant/VisualisConstant.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/constant/VisualisConstant.java deleted file mode 100644 index bafab65b5..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/constant/VisualisConstant.java +++ /dev/null @@ -1,25 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.constant; - -public interface VisualisConstant { - - String NODE_NAME_PREFIX = "linkis.appconn."; - - String DASHBOARD_PORTAL_IDS = "dashboardPortalIds"; - String DISPLAY_IDS = "displayIds"; - String WIDGET_IDS = "widgetIds"; - String VIEW_IDS = "viewIds"; - - String DASHBOARD_OPERATION_STRATEGY = "linkis.appconn.visualis.dashboard"; - String DISPLAY_OPERATION_STRATEGY = "linkis.appconn.visualis.display"; - String WIDGET_OPERATION_STRATEGY = "linkis.appconn.visualis.widget"; - String VIEW_OPERATION_STRATEGY = "linkis.appconn.visualis.view"; - - String EXECUTION_SCHEDULED = "Scheduled"; - String EXECUTION_RUNNING = "Running"; - String EXECUTION_SUCCEED = "Succeed"; - String EXECUTION_FAILED = "Failed"; - String EXECUTION_CANCELLED = "Cancelled"; - String EXECUTION_INITED = "Inited"; - String EXECUTION_TIMEOUT = "Timeout"; - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/ViewAsyncResultData.java 
b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/ViewAsyncResultData.java deleted file mode 100644 index 85ad41c11..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/ViewAsyncResultData.java +++ /dev/null @@ -1,150 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.model; - -import java.util.List; -import java.util.Map; - -public class ViewAsyncResultData { - private String status; - private String message; - private int totalCount; - private String execId; - private int pageNo; - private int pageSize; - private int progress; - private List columns; - private List> resultList; - - public static class Column { - private String name; - private String type; - - @Override - public String toString() { - return "Column{" + - "name='" + name + '\'' + - ", type='" + type + '\'' + - '}'; - } - - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public String getType() { - return type; - } - - public void setType(String type) { - this.type = type; - } - } - - - public String getStatus() { - return status; - } - - public void setStatus(String status) { - this.status = status; - } - - public String getMessage() { - return message; - } - - public void setMessage(String message) { - this.message = message; - } - - public int 
getTotalCount() { - return totalCount; - } - - public void setTotalCount(int totalCount) { - this.totalCount = totalCount; - } - - public String getExecId() { - return execId; - } - - public void setExecId(String execId) { - this.execId = execId; - } - - public int getPageNo() { - return pageNo; - } - - public void setPageNo(int pageNo) { - this.pageNo = pageNo; - } - - public int getPageSize() { - return pageSize; - } - - public void setPageSize(int pageSize) { - this.pageSize = pageSize; - } - - public int getProgress() { - return progress; - } - - public void setProgress(int progress) { - this.progress = progress; - } - - public List getColumns() { - return columns; - } - - public void setColumns(List columns) { - this.columns = columns; - } - - public List> getResultList() { - return resultList; - } - - public void setResultList(List> resultList) { - this.resultList = resultList; - } - - @Override - public String toString() { - return "ViewAsyncResultData{" + - "status='" + status + '\'' + - ", message='" + message + '\'' + - ", totalCount=" + totalCount + - ", execId='" + execId + '\'' + - ", pageNo=" + pageNo + - ", pageSize=" + pageSize + - ", progress=" + progress + - ", columns=" + columns + - ", resultList=" + resultList + - '}'; - } -} - diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/AsyncExecutionOperationStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/AsyncExecutionOperationStrategy.java deleted file mode 100644 index 702624448..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/AsyncExecutionOperationStrategy.java +++ /dev/null @@ -1,21 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; -import 
com.webank.wedatasphere.dss.standard.app.development.listener.ref.ExecutionResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -/** - * @author enjoyyin - * @date 2022-03-09 - * @since 0.5.0 - */ -public interface AsyncExecutionOperationStrategy extends OperationStrategy { - - String submit(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException; - - RefExecutionState state(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, String execId) throws ExternalOperationFailedException; - - ExecutionResponseRef getAsyncResult(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, String execId) throws ExternalOperationFailedException; - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategy.java deleted file mode 100644 index d2c5f0074..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategy.java +++ /dev/null @@ -1,39 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import 
com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - - -public interface OperationStrategy { - - String getStrategyName(); - - RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException; - - void deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl visualisDeleteRequestRef) throws ExternalOperationFailedException; - - ExportResponseRef exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException; - - QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl requestRef); - - ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException; - - RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException; - - RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException; - - ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException; - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategyFactory.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategyFactory.java deleted file mode 100644 index 06b1af64d..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/OperationStrategyFactory.java +++ /dev/null @@ -1,79 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation; - - -import 
com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.operation.impl.AbstractOperationStrategy; -import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; -import com.webank.wedatasphere.dss.standard.common.desc.AppInstance; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationWarnException; -import com.webank.wedatasphere.dss.standard.common.utils.AppStandardClassUtils; - -import java.util.*; -import java.util.function.Supplier; - -public class OperationStrategyFactory { - - private static OperationStrategyFactory factory; - private static SSORequestOperation ssoRequestOperation; - private static final Map> operationStrategies = new HashMap<>(); - private static final Map> operationStrategyClasses = new HashMap<>(); - - private OperationStrategyFactory() { - AppStandardClassUtils.getInstance(VisualisAppConn.VISUALIS_APPCONN_NAME).getClasses(OperationStrategy.class).forEach(clazz -> { - try { - operationStrategyClasses.put(clazz.newInstance().getStrategyName(), clazz); - } catch (InstantiationException | IllegalAccessException e) { - throw new ExternalOperationWarnException(50700, "Instance " + clazz + " of visualis OperationStrategy failed.", e); - } - }); - } - - - public OperationStrategy getOperationStrategy(AppInstance appInstance, String strategyName) throws ExternalOperationFailedException { - if(!operationStrategies.containsKey(appInstance)) { - synchronized (operationStrategies) { - if(!operationStrategies.containsKey(appInstance)) { - operationStrategies.put(appInstance, new ArrayList<>()); - } - } - } - List strategies = operationStrategies.get(appInstance); - Supplier> function = () -> strategies.stream() - .filter(strategy -> strategy.getStrategyName().equals(strategyName)).findAny(); - return function.get().orElseGet(() -> { - 
synchronized (strategies) { - return function.get().orElseGet(() -> { - OperationStrategy operationStrategy; - try { - operationStrategy = operationStrategyClasses.get(strategyName).newInstance(); - } catch (InstantiationException | IllegalAccessException e) { - throw new ExternalOperationWarnException(50700, "Instance " + strategyName + " of visualis OperationStrategy failed.", e); - } - if(operationStrategy instanceof AbstractOperationStrategy) { - ((AbstractOperationStrategy) operationStrategy).setBaseUrl(appInstance.getBaseUrl()); - ((AbstractOperationStrategy) operationStrategy).setSsoRequestOperation(ssoRequestOperation); - } - strategies.add(operationStrategy); - return operationStrategy; - }); - } - }); - } - - - public static OperationStrategyFactory getInstance() { - if (factory == null) { - synchronized (OperationStrategyFactory.class) { - if (factory == null) { - factory = new OperationStrategyFactory(); - } - } - } - return factory; - } - - public static void setSsoRequestOperation(SSORequestOperation ssoRequestOperation) { - OperationStrategyFactory.ssoRequestOperation = ssoRequestOperation; - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisDevelopmentOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisDevelopmentOperation.java deleted file mode 100644 index e51819e06..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisDevelopmentOperation.java +++ /dev/null @@ -1,37 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.standard.app.development.operation.AbstractDevelopmentOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.DevelopmentRequestRef; -import 
com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; -import com.webank.wedatasphere.dss.standard.common.desc.AppInstance; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -/** - * @author enjoyyin - * @date 2022-03-09 - * @since 0.5.0 - */ -public class VisualisDevelopmentOperation, V extends ResponseRef> - extends AbstractDevelopmentOperation { - - /** - * I override this method, since I want to use SSORequestOperation to request Visualis server. - * @return visualis appConn name - */ - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } - - protected DevelopmentService getDevelopmentService() { - return (DevelopmentService) service; - } - - protected AppInstance getAppInstance() { - return service.getAppInstance(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCopyOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCopyOperation.java deleted file mode 100644 index 6c60a6c05..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCopyOperation.java +++ /dev/null @@ -1,34 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import 
com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisRefCopyOperation - extends VisualisDevelopmentOperation - implements RefCopyOperation { - - @Override - public RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String url = getBaseUrl() + URLUtils.PROJECT_COPY_URL; - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to copy ref RefJobContent: {} in url {}.", nodeType, requestRef.getRefJobContent(), url); - - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("projectVersion", "v1"); - visualisPostAction.addRequestPayload("flowVersion", requestRef.getNewVersion()); - visualisPostAction.addRequestPayload("contextID", requestRef.getContextId()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - return OperationStrategyFactory.getInstance() - .getOperationStrategy(getDevelopmentService().getAppInstance(), nodeType) - .copyRef(requestRef, url, visualisPostAction); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java deleted file mode 100644 index 7bbca06f5..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisRefCreationOperation - extends VisualisDevelopmentOperation - implements RefCreationOperation { - - @Override - public RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to create ref with DSSJobContent: {}.", nodeType, requestRef.getDSSJobContent()); - return OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .createRef(requestRef); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java deleted file mode 100644 index bfd66909f..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java +++ /dev/null @@ -1,37 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache 
License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - - -public class VisualisRefDeletionOperation - extends VisualisDevelopmentOperation - implements RefDeletionOperation { - - @Override - public ResponseRef deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef) throws ExternalOperationFailedException { - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to delete ref RefJobContent: {}.", nodeType, requestRef.getRefJobContent()); - OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .deleteRef(requestRef); - return ResponseRef.newExternalBuilder().success(); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExecutionOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExecutionOperation.java deleted file mode 100644 index a971ee9ab..000000000 --- 
a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExecutionOperation.java +++ /dev/null @@ -1,135 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant; -import com.webank.wedatasphere.dss.appconn.visualis.operation.impl.ViewOptStrategy; -import com.webank.wedatasphere.dss.appconn.visualis.operation.impl.VisualisRefExecutionAction; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionAction; -import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; -import com.webank.wedatasphere.dss.standard.app.development.listener.core.LongTermRefExecutionOperation; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.ExecutionResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSGetAction; -import 
com.webank.wedatasphere.dss.standard.common.entity.ref.InternalResponseRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisRefExecutionOperation - extends LongTermRefExecutionOperation { - - @Override - protected RefExecutionAction submit(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef) throws ExternalOperationFailedException { - String nodeType = requestRef.getType().toLowerCase(); - String nodeName = VisualisConstant.NODE_NAME_PREFIX + nodeType; - OperationStrategy strategy = OperationStrategyFactory.getInstance().getOperationStrategy(service.getAppInstance(), nodeName); - if (isSupportAsyncExecution(requestRef, strategy)) { - return executeAsync(requestRef, strategy); - } else { - return executeSync(requestRef, strategy); - } - } - - @Override - public RefExecutionState state(RefExecutionAction action) throws ExternalOperationFailedException { - VisualisRefExecutionAction visualisRefExecutionAction = ((VisualisRefExecutionAction)action); - if(isAsyncExecution(visualisRefExecutionAction)) { - AsyncExecutionOperationStrategy strategy = (AsyncExecutionOperationStrategy) visualisRefExecutionAction.getStrategy(); - return strategy.state(visualisRefExecutionAction.getRequestRef(), visualisRefExecutionAction.getId()); - } else { - logger.warn("{} do not support async execution, turn async state to success to execute it.", visualisRefExecutionAction.getRequestRef().getType()); - visualisRefExecutionAction.getExecutionRequestRefContext().appendLog("do not support async execution, turn async state to success to execute it."); - return RefExecutionState.Success; - } - } - - @Override - public ExecutionResponseRef result(RefExecutionAction action) throws ExternalOperationFailedException { - VisualisRefExecutionAction visualisRefExecutionAction = ((VisualisRefExecutionAction)action); - 
if(isAsyncExecution(visualisRefExecutionAction)) { - AsyncExecutionOperationStrategy strategy = (AsyncExecutionOperationStrategy) visualisRefExecutionAction.getStrategy(); - return strategy.getAsyncResult(visualisRefExecutionAction.getRequestRef(), visualisRefExecutionAction.getId()); - } else { - ResponseRef responseRef = visualisRefExecutionAction.getStrategy() - .executeRef(visualisRefExecutionAction.getRequestRef()); - return ExecutionResponseRef.newBuilder().setResponseRef(responseRef).build(); - } - } - - /** - * I override this method, since I want to use SSORequestOperation to request Visualis server. - * @return visualis appConn name - */ - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } - - private RefExecutionAction executeSync(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef, - OperationStrategy strategy) { - return createRefExecutionAction(requestRef, "-1", strategy); - } - - private boolean isAsyncExecution(VisualisRefExecutionAction action) { - return !action.getId().equals("-1"); - } - - private RefExecutionAction createRefExecutionAction(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef, - String id, OperationStrategy strategy) { - VisualisRefExecutionAction action = new VisualisRefExecutionAction(); - action.setRequestRef(requestRef); - action.setExecutionRequestRefContext(requestRef.getExecutionRequestRefContext()); - action.setId(id); - action.setStrategy(strategy); - return action; - } - - /** - * Execute the ref asynchronously. - * - * @param requestRef - * @return - */ - private RefExecutionAction executeAsync(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef, - OperationStrategy strategy) throws ExternalOperationFailedException { - AsyncExecutionOperationStrategy asyncStrategy = (AsyncExecutionOperationStrategy) strategy; - // submit the task - String execId = asyncStrategy.submit(requestRef); - return createRefExecutionAction(requestRef, execId,
asyncStrategy); - } - - private boolean isSupportAsyncExecution(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, - OperationStrategy strategy) throws ExternalOperationFailedException { - - if(!(strategy instanceof ViewOptStrategy)) { - return false; - } - ViewOptStrategy viewStrategy = (ViewOptStrategy) strategy; - //判断数据源是不是hiveDatesouce,异步只支持hiveDatesouce,不支持jdbc - String url = URLUtils.getUrl(getBaseUrl(), URLUtils.VIEW_DATA_URL_IS__HIVE_DATA_SOURCE, viewStrategy.getId(ref.getRefJobContent())); - ref.getExecutionRequestRefContext().appendLog("dss execute view node, judge dataSource type from " + url); - DSSGetAction visualisGetAction = new DSSGetAction(); - visualisGetAction.setUser(ref.getUserName()); - visualisGetAction.setParameter("labels", ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(ref, ssoRequestOperation, url, visualisGetAction); - return (boolean) responseRef.getValue("isLinkisDataSource"); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java deleted file mode 100644 index 091ff7551..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java +++ /dev/null @@ -1,46 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisRefExportOperation extends VisualisDevelopmentOperation - implements RefExportOperation { - - @Override - public ExportResponseRef exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef) throws ExternalOperationFailedException { - String url = getBaseUrl() + URLUtils.projectUrl + "/export"; - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to export ref RefJobContent: {} in url {}.", nodeType, requestRef.getRefJobContent(), url); - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - visualisPostAction.addRequestPayload("partial", true); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - return OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .exportRef(requestRef, url, visualisPostAction); - } - -} diff --git 
a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java deleted file mode 100644 index 2adbb0ba7..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java +++ /dev/null @@ -1,56 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - - -public class VisualisRefImportOperation - extends VisualisDevelopmentOperation - implements RefImportOperation { - - @Override - public RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String url = getBaseUrl() + URLUtils.projectUrl + "/import"; - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to import ref RefJobContent: {} in url {}.", nodeType, requestRef.getRefJobContent(), url); - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - if(null == requestRef.getRefProjectId()){ - throw new ExternalOperationFailedException(100067,"导入节点Visualis工程ID为空"); - } - visualisPostAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - visualisPostAction.addRequestPayload("projectVersion", "v1"); - visualisPostAction.addRequestPayload("flowVersion", requestRef.getNewVersion()); - visualisPostAction.addRequestPayload("resourceId", requestRef.getResourceMap().get(ThirdlyRequestRef.ImportWitContextRequestRefImpl.RESOURCE_ID_KEY)); - visualisPostAction.addRequestPayload("version", requestRef.getResourceMap().get(ThirdlyRequestRef.ImportWitContextRequestRefImpl.RESOURCE_VERSION_KEY)); - - LabelRouteVO 
routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - return OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .importRef(requestRef, url, visualisPostAction); - - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java deleted file mode 100644 index d33b26591..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java +++ /dev/null @@ -1,37 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryJumpUrlOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - - -public class VisualisRefQueryOperation - extends VisualisDevelopmentOperation - implements RefQueryJumpUrlOperation { - - @Override - public QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl ref) throws ExternalOperationFailedException { - String nodeType = ref.getType().toLowerCase(); - logger.info("The {} of Visualis try to query ref RefJobContent: {}.", nodeType, ref.getRefJobContent()); - return OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .query(ref); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java deleted file mode 100644 index c017ece84..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.operation; - -import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - - -public class VisualisRefUpdateOperation - extends VisualisDevelopmentOperation - implements RefUpdateOperation { - - @Override - public ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String nodeType = requestRef.getType().toLowerCase(); - logger.info("The {} of Visualis try to update ref RefJobContent: {}.", nodeType, requestRef.getParameters()); - return OperationStrategyFactory.getInstance().getOperationStrategy(getAppInstance(), nodeType) - .updateRef(requestRef); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/AbstractOperationStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/AbstractOperationStrategy.java deleted file mode 100644 index d5927882f..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/AbstractOperationStrategy.java +++ /dev/null @@ -1,113 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.OperationStrategy; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import 
com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSGetAction; -import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; -import com.webank.wedatasphere.dss.standard.common.entity.ref.InternalResponseRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.commons.io.IOUtils; -import org.apache.commons.lang.exception.ExceptionUtils; -import org.apache.linkis.common.io.resultset.ResultSetWriter; -import org.apache.linkis.httpclient.request.HttpAction; -import org.apache.linkis.httpclient.response.HttpResult; -import org.apache.linkis.storage.domain.Column; -import org.apache.linkis.storage.domain.DataType; -import org.apache.linkis.storage.resultset.table.TableMetaData; -import org.apache.linkis.storage.resultset.table.TableRecord; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.IOException; -import java.util.List; -import java.util.Map; - -import static com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant.*; - -/** - * @author enjoyyin - * @date 2022-03-08 - * @since 0.5.0 - */ -public abstract class AbstractOperationStrategy implements OperationStrategy { - - protected final Logger logger = LoggerFactory.getLogger(getClass()); - - protected SSORequestOperation ssoRequestOperation; - protected String baseUrl; - - public void setSsoRequestOperation(SSORequestOperation ssoRequestOperation) { - this.ssoRequestOperation = ssoRequestOperation; - } - - public void 
setBaseUrl(String baseUrl) { - this.baseUrl = baseUrl; - } - - protected QueryJumpUrlResponseRef getQueryResponseRef(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl requestRef, Long projectId, - String jumpUrlFormat, String id) { - String jumpUrl = URLUtils.getUrl(baseUrl, jumpUrlFormat, projectId.toString(), id, requestRef.getName()); - String env = requestRef.getDSSLabels().stream().filter(dssLabel -> dssLabel instanceof EnvDSSLabel) - .map(dssLabel -> (EnvDSSLabel) dssLabel).findAny().get().getEnv(); - String retJumpUrl = URLUtils.getEnvUrl(jumpUrl, env); - return QueryJumpUrlResponseRef.newBuilder().setJumpUrl(retJumpUrl).success(); - } - - protected ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, - String url) throws ExternalOperationFailedException { - logger.info("User {} try to execute Visualis {} with refJobContent: {} in url {}.", ref.getExecutionRequestRefContext().getSubmitUser(), - ref.getType(), ref.getRefJobContent(), url); - DSSGetAction visualisGetAction = new DSSGetAction(); - visualisGetAction.setUser(ref.getExecutionRequestRefContext().getSubmitUser()); - visualisGetAction.setParameter("labels", ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - try { - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(ref, ssoRequestOperation, url, visualisGetAction); - Map resultMap = responseRef.toMap(); - List> columns = (List>) resultMap.get("columns"); - if (resultMap.get("columns") == null || columns.isEmpty()) { - ref.getExecutionRequestRefContext().appendLog("Cannot execute an empty Widget!"); - throw new ExternalOperationFailedException(90176, "Cannot execute an empty Widget!", null); - } - List> resultList = (List>) resultMap.get("resultList"); - Column[] linkisColumns = columns.stream().map(columnData -> new Column(columnData.get("name"), - DataType.toDataType(columnData.get("type").toLowerCase()), "")) - .toArray(Column[]::new); - ResultSetWriter resultSetWriter = 
ref.getExecutionRequestRefContext().createTableResultSetWriter(); - resultSetWriter.addMetaData(new TableMetaData(linkisColumns)); - for (Map recordMap : resultList) { - resultSetWriter.addRecord(new TableRecord(recordMap.values().toArray())); - } - resultSetWriter.flush(); - IOUtils.closeQuietly(resultSetWriter); - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); - } catch (IOException e) { - ref.getExecutionRequestRefContext().appendLog("Failed to write widget resultSet to storage. Caused by: " + ExceptionUtils.getRootCauseMessage(e)); - throw new ExternalOperationFailedException(90176, "Failed to write widget resultSet to storage.", e); - } - return ResponseRef.newExternalBuilder().success(); - } - - protected RefExecutionState toRefExecutionState(String state) { - switch (state) { - case EXECUTION_FAILED: - case EXECUTION_TIMEOUT: - return RefExecutionState.Failed; - case EXECUTION_SUCCEED: - return RefExecutionState.Success; - case EXECUTION_CANCELLED: - return RefExecutionState.Killed; - case EXECUTION_SCHEDULED: - case EXECUTION_INITED: - return RefExecutionState.Accepted; - default: - return RefExecutionState.Running; - } - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DashboardOptStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DashboardOptStrategy.java deleted file mode 100644 index 568744aa9..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DashboardOptStrategy.java +++ /dev/null @@ -1,230 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - -import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant; -import com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import 
com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSDownloadAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPutAction; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.commons.io.IOUtils; -import org.apache.commons.lang.StringUtils; -import org.apache.linkis.common.io.resultset.ResultSetWriter; -import org.apache.linkis.server.conf.ServerConfiguration; -import org.apache.linkis.storage.LineMetaData; -import org.apache.linkis.storage.LineRecord; - -import java.io.ByteArrayOutputStream; -import java.util.Base64; -import java.util.HashMap; -import java.util.Map; - -public class DashboardOptStrategy extends AbstractOperationStrategy { - - @Override - public String getStrategyName() { - return VisualisConstant.DASHBOARD_OPERATION_STRATEGY; - } - - @Override - public RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - String url = baseUrl + 
URLUtils.dashboardPortalUrl; - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("name", requestRef.getName()); - visualisPostAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - visualisPostAction.addRequestPayload("avatar", "18"); - visualisPostAction.addRequestPayload("publish", true); - visualisPostAction.addRequestPayload("description", requestRef.getDSSJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - // 执行http请求,获取响应结果 - ResponseRef dashboardPortalResponseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - return createDashboard(dashboardPortalResponseRef, requestRef); - } - - - @Override - public void deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl visualisDeleteRequestRef) throws ExternalOperationFailedException { - String portalId = getDashboardPortalId(visualisDeleteRequestRef.getRefJobContent()); - if (StringUtils.isEmpty(portalId)) { - throw new ExternalOperationFailedException(90177, "Delete Dashboard failed for portalId id is null", null); - } - String url = baseUrl + URLUtils.dashboardPortalUrl + "/" + portalId; - // Delete协议在加入url label时会存在被nginx拦截转发情况,在这里换成Post协议对label进行兼容 - DSSPostAction deleteAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(visualisDeleteRequestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - deleteAction.addRequestPayload("labels", routeVO); - deleteAction.setUser(visualisDeleteRequestRef.getUserName()); - VisualisCommonUtil.getExternalResponseRef(visualisDeleteRequestRef, ssoRequestOperation, url, deleteAction); - } - - - private RefJobContentResponseRef createDashboard(ResponseRef dashboardCreateResponseRef, 
- ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - //dashboardCreateResponseRef保存有dashboardPortal的值,此时从dashboardPortal中取id作为dashboardPortalId - String portalId = NumberUtils.parseDoubleString(dashboardCreateResponseRef.toMap().get("id").toString()); - String url = baseUrl + URLUtils.dashboardPortalUrl + "/" + portalId + "/dashboards"; - - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("config", ""); - visualisPostAction.addRequestPayload("dashboardPortalId", Long.parseLong(portalId)); - visualisPostAction.addRequestPayload("index", 0); - visualisPostAction.addRequestPayload("name", requestRef.getName()); - visualisPostAction.addRequestPayload("parentId", 0); - visualisPostAction.addRequestPayload("type", 1); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - return VisualisCommonUtil.getRefJobContentResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - } - - - @Override - public ExportResponseRef exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - String portalId = getDashboardPortalId(requestRef.getRefJobContent()); - if (StringUtils.isEmpty(portalId)) { - throw new ExternalOperationFailedException(90177, "export Dashboard failed for portalId id is null", null); - } - visualisPostAction.addRequestPayload("dashboardPortalIds", Long.parseLong(NumberUtils.parseDoubleString(portalId))); - return VisualisCommonUtil.getExportResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - } - - - @Override - public QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl visualisOpenRequestRef) { - String dashboardId = 
getDashboardPortalId(visualisOpenRequestRef.getRefJobContent()); - Long projectId = visualisOpenRequestRef.getRefProjectId(); - return getQueryResponseRef(visualisOpenRequestRef, projectId, URLUtils.DASHBOARD_JUMP_URL_FORMAT, dashboardId); - } - - - @Override - public ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String id = getDashboardPortalId(requestRef.getRefJobContent()); - if (StringUtils.isEmpty(id)) { - throw new ExternalOperationFailedException(90177, "Update Dashboard Exception, id is null"); - } - String url = baseUrl + URLUtils.dashboardPortalUrl + "/" + id; - DSSPutAction putAction = new DSSPutAction(); - putAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - putAction.addRequestPayload("name", requestRef.getName()); - putAction.addRequestPayload("id", Long.parseLong(id)); - putAction.addRequestPayload("avatar", "9"); - putAction.addRequestPayload("description", requestRef.getRefJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - putAction.addRequestPayload("publish", true); - putAction.addRequestPayload("roleIds", Lists.newArrayList()); - putAction.setUser(requestRef.getUserName()); - - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - putAction.addRequestPayload("labels", routeVO); - - return VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, putAction); - } - - - @Override - public RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - String oldDashboardPortalId = getDashboardPortalId(requestRef.getRefJobContent()); - visualisPostAction.addRequestPayload(VisualisConstant.DASHBOARD_PORTAL_IDS, oldDashboardPortalId); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, 
ssoRequestOperation, url, visualisPostAction); - - String dashboardId = NumberUtils.parseDoubleString(requestRef.getRefJobContent().get("id").toString()); - @SuppressWarnings("unchecked") - Map dashboardData = (Map) responseRef.toMap().get("dashboard"); - Map refJobContent = new HashMap<>(2); - refJobContent.put("id", Double.parseDouble(dashboardData.get(dashboardId).toString())); - refJobContent.put("projectId", requestRef.getRefProjectId()); - - //dashboardPortal - String dashboardPortalId = NumberUtils.parseDoubleString(requestRef.getRefJobContent().get("dashboardPortalId").toString()); - @SuppressWarnings("unchecked") - Map dashboardPortalData = (Map) responseRef.toMap().get("dashboardPortal"); - refJobContent.put("dashboardPortalId", Double.parseDouble(dashboardPortalData.get(dashboardPortalId).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(refJobContent).success(); - } - - - @Override - public RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - Map jobContent = new HashMap<>(3); - jobContent.put("projectId", requestRef.getRefProjectId()); - String dashboardPortalId = getDashboardPortalId(requestRef.getRefJobContent()); - @SuppressWarnings("unchecked") - Map dashboardPortal = (Map) responseRef.toMap().get("dashboardPortal"); - jobContent.put("dashboardPortalId", Double.parseDouble(dashboardPortal.get(dashboardPortalId).toString())); - - String dashboardId = NumberUtils.parseDoubleString(requestRef.getRefJobContent().get("id").toString()); - Map dashboard = (Map) responseRef.toMap().get("dashboard"); - jobContent.put("id", Double.parseDouble(dashboard.get(dashboardId).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - 
} - - @Override - public ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException { - String previewUrl = URLUtils.getUrl(baseUrl, URLUtils.DASHBOARD_PREVIEW_URL_FORMAT, getDashboardPortalId(ref.getRefJobContent())); - logger.info("The {} of Visualis try to execute ref RefJobContent: {} in previewUrl {}.", ref.getType(), ref.getRefJobContent(), previewUrl); - ref.getExecutionRequestRefContext().appendLog(String.format("The %s of Visualis try to execute ref RefJobContent: %s in previewUrl %s.", ref.getType(), ref.getRefJobContent(), previewUrl)); - DSSDownloadAction previewDownloadAction = new DSSDownloadAction(); - previewDownloadAction.setUser(ref.getUserName()); - previewDownloadAction.setParameter("labels", ((EnvDSSLabel) (ref.getDSSLabels().get(0))).getEnv()); - - DSSDownloadAction metadataDownloadAction = new DSSDownloadAction(); - metadataDownloadAction.setUser(ref.getUserName()); - metadataDownloadAction.setParameter("labels", ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - try { - VisualisCommonUtil.getHttpResult(ref, ssoRequestOperation, previewUrl, previewDownloadAction); - ByteArrayOutputStream os = new ByteArrayOutputStream(); - IOUtils.copy(previewDownloadAction.getInputStream(), os); - String response = new String(Base64.getEncoder().encode(os.toByteArray())); - - String metaUrl = URLUtils.getUrl(baseUrl, URLUtils.DASHBOARD_METADATA_URL_FORMAT, getDashboardPortalId(ref.getRefJobContent())); - logger.info("The {} of Visualis try to execute ref RefJobContent: {} in metaUrl {}.", ref.getType(), ref.getRefJobContent(), previewUrl); - ref.getExecutionRequestRefContext().appendLog(String.format("The %s of Visualis try to execute ref RefJobContent: %s in metaUrl %s.", ref.getType(), ref.getRefJobContent(), previewUrl)); - VisualisCommonUtil.getHttpResult(ref, ssoRequestOperation, metaUrl, metadataDownloadAction); - String metadata = 
org.apache.commons.lang3.StringUtils.chomp(IOUtils.toString(metadataDownloadAction.getInputStream(), ServerConfiguration.BDP_SERVER_ENCODING().getValue())); - ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createPictureResultSetWriter(); - resultSetWriter.addMetaData(new LineMetaData(metadata)); - resultSetWriter.addRecord(new LineRecord(response)); - resultSetWriter.flush(); - IOUtils.closeQuietly(resultSetWriter); - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); - } catch (Throwable e) { - ref.getExecutionRequestRefContext().appendLog("Failed to execute Dashboard url " + previewUrl); - throw new ExternalOperationFailedException(90176, "Failed to debug Dashboard", e); - } finally { - IOUtils.closeQuietly(previewDownloadAction); - IOUtils.closeQuietly(metadataDownloadAction); - } - return ResponseRef.newExternalBuilder().success(); - } - - private String getDashboardPortalId(Map refJobContent) { - String dashboardPortalId = refJobContent.get("dashboardPortalId").toString(); - return NumberUtils.parseDoubleString(dashboardPortalId); - } - - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DisplayOptStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DisplayOptStrategy.java deleted file mode 100644 index e262d5972..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/DisplayOptStrategy.java +++ /dev/null @@ -1,206 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - - -import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant; -import com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import 
com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSDownloadAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPutAction; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.commons.io.IOUtils; -import org.apache.commons.lang3.StringUtils; -import org.apache.linkis.common.io.resultset.ResultSetWriter; -import org.apache.linkis.server.conf.ServerConfiguration; -import org.apache.linkis.storage.LineMetaData; -import org.apache.linkis.storage.LineRecord; - -import java.io.ByteArrayOutputStream; -import java.util.Base64; -import java.util.HashMap; -import java.util.Map; - -public class DisplayOptStrategy extends AbstractOperationStrategy { - - @Override - public String getStrategyName() { - return VisualisConstant.DISPLAY_OPERATION_STRATEGY; - } - - @Override - public RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.displayUrl; - logger.info("requestUrl:{}", url); - - DSSPostAction visualisPostAction = 
new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("name", requestRef.getName()); - visualisPostAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - visualisPostAction.addRequestPayload("avatar", "18"); - visualisPostAction.addRequestPayload("publish", true); - visualisPostAction.addRequestPayload("description", requestRef.getDSSJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - - // Execute the HTTP request and get the response - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - String displayId = responseRef.toMap().get("id").toString(); - Map jobContent = new HashMap<>(1); - jobContent.put("displayId", displayId); - createDisplaySlide(displayId, requestRef); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - @Override - public void deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl visualisDeleteRequestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.displayUrl + "/" + getDisplayId(visualisDeleteRequestRef.getRefJobContent()); - // DELETE requests carrying a url label may be intercepted and forwarded by nginx, so POST is used here to stay compatible with labels - DSSPostAction deleteAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(visualisDeleteRequestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - deleteAction.addRequestPayload("labels", routeVO); - deleteAction.setUser(visualisDeleteRequestRef.getUserName()); - VisualisCommonUtil.getExternalResponseRef(visualisDeleteRequestRef, ssoRequestOperation, url, deleteAction); - } - - - private void createDisplaySlide(String displayId, ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - 
String id = NumberUtils.parseDoubleString(displayId); - String url = baseUrl + URLUtils.displayUrl + "/" + id + "/slides"; - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(requestRef.getUserName()); - visualisPostAction.addRequestPayload("config", URLUtils.displaySlideConfig); - visualisPostAction.addRequestPayload("displayId", Long.parseLong(id)); - visualisPostAction.addRequestPayload("index", 0); - - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - } - - - @Override - public ExportResponseRef exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef, - String url, - DSSPostAction postAction) throws ExternalOperationFailedException { - postAction.addRequestPayload("displayIds", getDisplayId(requestRef.getRefJobContent())); - return VisualisCommonUtil.getExportResponseRef(requestRef, ssoRequestOperation, url, postAction); - } - - @Override - public QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl requestRef) { - String displayId = getDisplayId(requestRef.getRefJobContent()).toString(); - return getQueryResponseRef(requestRef, requestRef.getRefProjectId(), URLUtils.DISPLAY_JUMP_URL_FORMAT, displayId); - } - - private Long getDisplayId(Map refJobContent) { - String displayId = refJobContent.get("displayId").toString(); - return Long.parseLong(NumberUtils.parseDoubleString(displayId)); - } - - @Override - public ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - long id = getDisplayId(requestRef.getRefJobContent()); - String url = baseUrl + URLUtils.displayUrl + "/" + id; - DSSPutAction putAction = new DSSPutAction(); - putAction.addRequestPayload("projectId", 
requestRef.getRefProjectId()); - putAction.addRequestPayload("name", requestRef.getName()); - putAction.addRequestPayload("id", id); - putAction.addRequestPayload("avatar", "9"); - putAction.addRequestPayload("description", requestRef.getRefJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - putAction.addRequestPayload("publish", true); - putAction.addRequestPayload("roleIds", Lists.newArrayList()); - putAction.setUser(requestRef.getUserName()); - - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(((EnvDSSLabel) (requestRef.getDSSLabels().get(0))).getEnv()); - putAction.addRequestPayload("labels", routeVO); - - return VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, putAction); - } - - - @Override - public RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef, - String url, - DSSPostAction postAction) throws ExternalOperationFailedException { - Long id = getDisplayId(requestRef.getRefJobContent()); - postAction.addRequestPayload(VisualisConstant.DISPLAY_IDS, id); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, postAction); - @SuppressWarnings("unchecked") - Map displayData = (Map) responseRef.toMap().get("display"); - Map refJobContent = new HashMap<>(1); - refJobContent.put("displayId", Double.parseDouble(displayData.get(id.toString()).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(refJobContent).success(); - } - - - @Override - @SuppressWarnings("unchecked") - public RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - Map jobContent = new HashMap<>(1); - String id = getDisplayId(requestRef.getRefJobContent()).toString(); - - 
Map displayData =(Map) responseRef.toMap().get("display"); - jobContent.put("displayId", Double.parseDouble(displayData.get(id).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - - @Override - public ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException { - String previewUrl = URLUtils.getUrl(baseUrl, URLUtils.DISPLAY_PREVIEW_URL_FORMAT, getDisplayId(ref.getRefJobContent()).toString()); - logger.info("User {} try to execute Visualis display with refJobContent: {} in previewUrl {}.", ref.getExecutionRequestRefContext().getSubmitUser(), - ref.getRefJobContent(), previewUrl); - ref.getExecutionRequestRefContext().appendLog(String.format("The %s of Visualis try to execute ref RefJobContent: %s in previewUrl %s.", ref.getType(), ref.getRefJobContent(), previewUrl)); - DSSDownloadAction previewDownloadAction = new DSSDownloadAction(); - previewDownloadAction.setUser(ref.getExecutionRequestRefContext().getSubmitUser()); - previewDownloadAction.setParameter("labels", ((EnvDSSLabel) (ref.getDSSLabels().get(0))).getEnv()); - - DSSDownloadAction metadataDownloadAction = new DSSDownloadAction(); - metadataDownloadAction.setUser(ref.getExecutionRequestRefContext().getSubmitUser()); - metadataDownloadAction.setParameter("labels", ((EnvDSSLabel) (ref.getDSSLabels().get(0))).getEnv()); - - try { - VisualisCommonUtil.getHttpResult(ref, ssoRequestOperation, previewUrl, previewDownloadAction); - ByteArrayOutputStream os = new ByteArrayOutputStream(); - IOUtils.copy(previewDownloadAction.getInputStream(), os); - String response = new String(Base64.getEncoder().encode(os.toByteArray())); - - String metaUrl = URLUtils.getUrl(baseUrl, URLUtils.DISPLAY_METADATA_URL_FORMAT, getDisplayId(ref.getRefJobContent()).toString()); - VisualisCommonUtil.getHttpResult(ref, ssoRequestOperation, metaUrl, metadataDownloadAction); - String metadata = 
StringUtils.chomp(IOUtils.toString(metadataDownloadAction.getInputStream(), ServerConfiguration.BDP_SERVER_ENCODING().getValue())); - ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createPictureResultSetWriter(); - resultSetWriter.addMetaData(new LineMetaData(metadata)); - resultSetWriter.addRecord(new LineRecord(response)); - resultSetWriter.flush(); - IOUtils.closeQuietly(resultSetWriter); - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); - } catch (Throwable e) { - ref.getExecutionRequestRefContext().appendLog("Failed to debug Display url " + previewUrl); - throw new ExternalOperationFailedException(90176, "Failed to debug Display", e); - } finally { - IOUtils.closeQuietly(previewDownloadAction); - IOUtils.closeQuietly(metadataDownloadAction); - } - return ResponseRef.newExternalBuilder().success(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/ViewOptStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/ViewOptStrategy.java deleted file mode 100644 index 7f29fcc69..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/ViewOptStrategy.java +++ /dev/null @@ -1,265 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - -import com.google.common.collect.Lists; -import com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant; -import com.webank.wedatasphere.dss.appconn.visualis.model.ViewAsyncResultData; -import com.webank.wedatasphere.dss.appconn.visualis.operation.AsyncExecutionOperationStrategy; -import com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import 
com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.ExecutionResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSGetAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPutAction; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.commons.io.IOUtils; -import org.apache.commons.lang.StringUtils; -import org.apache.commons.lang.exception.ExceptionUtils; -import org.apache.linkis.common.exception.ErrorException; -import org.apache.linkis.common.io.resultset.ResultSetWriter; -import org.apache.linkis.cs.client.ContextClient; -import org.apache.linkis.cs.client.builder.ContextClientFactory; -import org.apache.linkis.cs.client.utils.SerializeHelper; -import org.apache.linkis.cs.common.entity.enumeration.ContextType; -import org.apache.linkis.cs.common.entity.source.ContextID; -import org.apache.linkis.cs.common.utils.CSCommonUtils; -import org.apache.linkis.server.BDPJettyServerHelper; -import org.apache.linkis.storage.domain.Column; -import org.apache.linkis.storage.domain.DataType; -import 
org.apache.linkis.storage.resultset.table.TableMetaData; - import org.apache.linkis.storage.resultset.table.TableRecord; - import org.slf4j.Logger; - import org.slf4j.LoggerFactory; - - import java.util.HashMap; - import java.util.List; - import java.util.Map; - import java.util.Optional; - - public class ViewOptStrategy extends AbstractOperationStrategy implements AsyncExecutionOperationStrategy { - private final static Logger logger = LoggerFactory.getLogger(ViewOptStrategy.class); - - @Override - public String getStrategyName() { - return VisualisConstant.VIEW_OPERATION_STRATEGY; - } - - @Override - public RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - - String url = baseUrl + URLUtils.VIEW_URL; - DSSPostAction postAction = new DSSPostAction(); - postAction.setUser(requestRef.getUserName()); - postAction.addRequestPayload("name", requestRef.getName()); - postAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - postAction.addRequestPayload("description", requestRef.getDSSJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - postAction.addRequestPayload("sourceId", 0); - postAction.addRequestPayload("config", ""); - postAction.addRequestPayload("sql", ""); - postAction.addRequestPayload("model", ""); - postAction.addRequestPayload("roles", Lists.newArrayList()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute((requestRef.getDSSLabels().get(0).getValue().get("DSSEnv"))); - postAction.addRequestPayload("labels", routeVO); - - // Execute the HTTP request and get the response - return VisualisCommonUtil.getRefJobContentResponseRef(requestRef, ssoRequestOperation, url, postAction); - } - - - @Override - public void deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.VIEW_URL + "/" + getId(requestRef.getRefJobContent()); - // DELETE requests carrying a url label may be intercepted and forwarded by nginx, so POST is used here to stay compatible with labels 
- DSSPostAction deleteAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - deleteAction.addRequestPayload("labels", routeVO); - deleteAction.setUser(requestRef.getUserName()); - VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, deleteAction); - } - - - @Override - public ExportResponseRef exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef, String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - visualisPostAction.addRequestPayload("viewIds", Double.parseDouble(getId(requestRef.getRefJobContent()))); - return VisualisCommonUtil.getExportResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - } - - @Override - public QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl requestRef) { - String id = getId(requestRef.getRefJobContent()); - return getQueryResponseRef(requestRef, requestRef.getRefProjectId(), URLUtils.VIEW_JUMP_URL_FORMAT, id); - } - - @Override - public ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String id = getId(requestRef.getRefJobContent()); - String url = baseUrl + URLUtils.VIEW_URL + "/" + id; - logger.info("requestUrl: {}.", url); - DSSPutAction putAction = new DSSPutAction(); - putAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - putAction.addRequestPayload("name", requestRef.getName()); - putAction.addRequestPayload("id", Long.parseLong(id)); - putAction.addRequestPayload("description", requestRef.getRefJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - putAction.setUser(requestRef.getUserName()); - - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - putAction.addRequestPayload("labels", routeVO); - - return 
VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, putAction); - } - - - @Override - public RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef, - String url, - DSSPostAction postAction) throws ExternalOperationFailedException { - - postAction.addRequestPayload(VisualisConstant.VIEW_IDS, getId(requestRef.getRefJobContent())); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, postAction); - String id = getId(requestRef.getRefJobContent()); - @SuppressWarnings("unchecked") - Map viewData = (Map) responseRef.toMap().get("view"); - Map jobContent = new HashMap<>(2); - jobContent.put("id", Double.parseDouble(viewData.get(id).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - - @Override - public RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - Map jobContent = new HashMap<>(2); - String id = getId(requestRef.getRefJobContent()); - - @SuppressWarnings("unchecked") - Map viewData = (Map) responseRef.toMap().get("view"); - jobContent.put("projectId", requestRef.getParameter("projectId")); - jobContent.put("id", Double.parseDouble(viewData.get(id).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - @Override - public ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException { - String url = URLUtils.getUrl(baseUrl, URLUtils.VIEW_DATA_URL_FORMAT, getId(ref.getRefJobContent())); - ResponseRef responseRef = executeRef(ref, url); - try { - cleanCSTabel(ref); - } catch (ErrorException e) { - 
ref.getExecutionRequestRefContext().appendLog("ERROR: Failed to clean cs tables. Caused by: " + ExceptionUtils.getRootCauseMessage(e)); - throw new ExternalOperationFailedException(90176, "Failed to clean cs tables.", e); - } - return responseRef; - } - - @Override - public String submit(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException { - String url = URLUtils.getUrl(baseUrl, URLUtils.VIEW_DATA_URL_SUBMIT, getId(ref.getRefJobContent())); - logger.info("User {} try to submit Visualis view with refJobContent: {} in url {}.", ref.getExecutionRequestRefContext().getSubmitUser(), - ref.getRefJobContent(), url); - ref.getExecutionRequestRefContext().appendLog("dss execute view node, ready to submit to " + url); - DSSGetAction visualisGetAction = new DSSGetAction(); - visualisGetAction.setUser(ref.getUserName()); - visualisGetAction.setParameter("labels", ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(ref, ssoRequestOperation, url, visualisGetAction); - Map paginateWithExecStatusMap = (Map) responseRef.toMap().get("paginateWithExecStatus"); - return paginateWithExecStatusMap.get("execId").toString(); - } - - @Override - public RefExecutionState state(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, String execId) throws ExternalOperationFailedException { - if (StringUtils.isEmpty(execId)) { - ref.getExecutionRequestRefContext().appendLog("dss execute view error for execId is null when getting state!"); - throw new ExternalOperationFailedException(90176, "dss execute view error when getting state"); - } - String url = URLUtils.getUrl(baseUrl, URLUtils.VIEW_DATA_URL_STATE, execId); - ref.getExecutionRequestRefContext().appendLog("dss execute view node, ready to get state from " + url); - - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(ref.getExecutionRequestRefContext().getSubmitUser()); - 
LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(ref, ssoRequestOperation, url, visualisPostAction); - ref.getExecutionRequestRefContext().appendLog(responseRef.getResponseBody()); - - String status = responseRef.toMap().get("status").toString(); - return toRefExecutionState(status); - } - - @Override - public ExecutionResponseRef getAsyncResult(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref, String execId) throws ExternalOperationFailedException { - if (StringUtils.isEmpty(execId)) { - ref.getExecutionRequestRefContext().appendLog("dss execute view error for execId is null when getting result!"); - throw new ExternalOperationFailedException(90176, "dss execute view error when getting result"); - } - String url = URLUtils.getUrl(baseUrl, URLUtils.VIEW_DATA_URL_ASYNC_RESULT, execId); - ref.getExecutionRequestRefContext().appendLog("dss execute view node, ready to get result set from " + url); - DSSPostAction visualisPostAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(ref.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - - visualisPostAction.setUser(ref.getExecutionRequestRefContext().getSubmitUser()); - try { - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(ref, ssoRequestOperation, url, visualisPostAction); - ViewAsyncResultData responseData = BDPJettyServerHelper.gson().fromJson(BDPJettyServerHelper.gson().toJson(responseRef.toMap()), ViewAsyncResultData.class); - ref.getExecutionRequestRefContext().appendLog("get responseData success."); - List oldColumns = Optional.of(responseData.getColumns()).orElseThrow(() -> new ExternalOperationFailedException(90176, "DSS execute view node failed, responseData is empty", null)); - if 
(oldColumns.isEmpty()) { - ref.getExecutionRequestRefContext().appendLog("dss execute view node failed, columns is empty"); - throw new ExternalOperationFailedException(90176, "dss execute view node failed, columns is empty", null); - } - List columns = Lists.newArrayList(); - for (ViewAsyncResultData.Column columnData : oldColumns) { - columns.add(new Column(columnData.getName(), DataType.toDataType(columnData.getType().toLowerCase()), "")); - } - ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createTableResultSetWriter(); - resultSetWriter.addMetaData(new TableMetaData(columns.toArray(new Column[0]))); - List> oldResultList = Optional.of(responseData.getResultList()).orElseThrow(() -> new ExternalOperationFailedException(90176, "DSS execute view node failed, resultList is empty", null)); - for (Map recordMap : oldResultList) { - resultSetWriter.addRecord(new TableRecord(recordMap.values().toArray())); - } - resultSetWriter.flush(); - IOUtils.closeQuietly(resultSetWriter); - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); - cleanCSTabel(ref); - } catch (Throwable e) { - ref.getExecutionRequestRefContext().appendLog("dss execute view node failed, url: " + url); - ref.getExecutionRequestRefContext().appendLog(e.getMessage()); - throw new ExternalOperationFailedException(90176, "dss execute view node failed", e); - } - return ExecutionResponseRef.newBuilder().success(); - } - - - public String getId(Map requestRef) { - return NumberUtils.parseDoubleString(requestRef.get("id").toString()); - } - - // To ensure View nodes leave no CS tables behind, clean up the CS tables produced by View execution - private void cleanCSTabel(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ErrorException { - String contextIdStr = ref.getContextId(); - String nodeName = ref.getName(); - ContextID contextID = SerializeHelper.deserializeContextID(contextIdStr); - ContextClient contextClient = ContextClientFactory.getOrCreateContextClient(); - 
contextClient.removeAllValueByKeyPrefixAndContextType(contextID, ContextType.METADATA, CSCommonUtils.NODE_PREFIX + nodeName); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/VisualisRefExecutionAction.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/VisualisRefExecutionAction.java deleted file mode 100644 index 32462ca00..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/VisualisRefExecutionAction.java +++ /dev/null @@ -1,32 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.OperationStrategy; -import com.webank.wedatasphere.dss.standard.app.development.listener.common.AbstractRefExecutionAction; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; - -/** - * @author enjoyyin - * @date 2022-03-09 - * @since 0.5.0 - */ -public class VisualisRefExecutionAction extends AbstractRefExecutionAction { - - private RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef; - private OperationStrategy strategy; - - public RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef getRequestRef() { - return requestRef; - } - - public void setRequestRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef requestRef) { - this.requestRef = requestRef; - } - - public OperationStrategy getStrategy() { - return strategy; - } - - public void setStrategy(OperationStrategy strategy) { - this.strategy = strategy; - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/WidgetOptStrategy.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/WidgetOptStrategy.java deleted file mode 100644 index 688228702..000000000 --- 
a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/impl/WidgetOptStrategy.java +++ /dev/null @@ -1,174 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.operation.impl; - -import com.webank.wedatasphere.dss.appconn.visualis.constant.VisualisConstant; -import com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.entity.node.DSSNode; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; -import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.*; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.linkis.cs.common.utils.CSCommonUtils; - -import java.util.HashMap; -import java.util.List; -import java.util.Map; - -public class WidgetOptStrategy extends AbstractOperationStrategy { - - @Override - public String getStrategyName() { - return VisualisConstant.WIDGET_OPERATION_STRATEGY; - } - - @Override - public RefJobContentResponseRef createRef(ThirdlyRequestRef.DSSJobContentWithContextRequestRef requestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.widgetUrl; - DSSPostAction postAction = new DSSPostAction(); - 
postAction.setUser(requestRef.getUserName()); - postAction.addRequestPayload("widgetName", requestRef.getName()); - postAction.addRequestPayload("projectId", requestRef.getRefProjectId()); - postAction.addRequestPayload("description", requestRef.getDSSJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - postAction.addRequestPayload(CSCommonUtils.CONTEXT_ID_STR, requestRef.getContextId()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(((EnvDSSLabel) (requestRef.getDSSLabels().get(0))).getEnv()); - postAction.addRequestPayload("labels", routeVO); - if (requestRef.getDSSJobContent().containsKey(DSSJobContentConstant.UP_STREAM_KEY)) { - List dssNodes = (List) requestRef.getDSSJobContent().get(DSSJobContentConstant.UP_STREAM_KEY); - postAction.addRequestPayload(CSCommonUtils.NODE_NAME_STR, dssNodes.get(0).getName()); - } - // Execute the HTTP request and get the response - RefJobContentResponseRef responseRef = VisualisCommonUtil.getRefJobContentResponseRef(requestRef, ssoRequestOperation, url, postAction); - // update cs - updateCsRef(requestRef, DSSCommonUtils.parseToLong(responseRef.getRefJobContent().get("widgetId")), requestRef.getContextId()); - return responseRef; - } - - @Override - public void deleteRef(ThirdlyRequestRef.RefJobContentRequestRefImpl visualisDeleteRequestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.widgetDeleteUrl + "/" + getWidgetId(visualisDeleteRequestRef.getRefJobContent()); - // DELETE requests carrying a url label may be intercepted and forwarded by nginx, so POST is used here to stay compatible with labels - DSSPostAction deleteAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(((EnvDSSLabel) (visualisDeleteRequestRef.getDSSLabels().get(0))).getEnv()); - deleteAction.addRequestPayload("labels", routeVO); - deleteAction.setUser(visualisDeleteRequestRef.getUserName()); - VisualisCommonUtil.getExternalResponseRef(visualisDeleteRequestRef, ssoRequestOperation, url, deleteAction); - } - - - @Override - public ExportResponseRef 
exportRef(ThirdlyRequestRef.RefJobContentRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - visualisPostAction.addRequestPayload("widgetIds", Long.parseLong(getWidgetId(requestRef.getRefJobContent()))); - return VisualisCommonUtil.getExportResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - } - - - @Override - public QueryJumpUrlResponseRef query(ThirdlyRequestRef.QueryJumpUrlRequestRefImpl visualisOpenRequestRef) { - String widgetId = getWidgetId(visualisOpenRequestRef.getRefJobContent()); - return getQueryResponseRef(visualisOpenRequestRef, visualisOpenRequestRef.getRefProjectId(), URLUtils.WIDGET_JUMP_URL_FORMAT, widgetId); - } - - @Override - public ResponseRef updateRef(ThirdlyRequestRef.UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.widgetUpdateUrl; - DSSPostAction postAction = new DSSPostAction(); - try { - postAction.addRequestPayload("id", Long.parseLong(getWidgetId(requestRef.getRefJobContent()))); - } catch (Exception e) { - throw new ExternalOperationFailedException(90177, "Update Widget Exception", e); - } - postAction.addRequestPayload("name", requestRef.getName()); - postAction.addRequestPayload("description", requestRef.getRefJobContent().get(DSSJobContentConstant.NODE_DESC_KEY)); - postAction.setUser(requestRef.getUserName()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(((EnvDSSLabel) (requestRef.getDSSLabels().get(0))).getEnv()); - postAction.addRequestPayload("labels", routeVO); - return VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, postAction); - } - - - @Override - public RefJobContentResponseRef copyRef(ThirdlyRequestRef.CopyWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - visualisPostAction.addRequestPayload(VisualisConstant.WIDGET_IDS, 
getWidgetId(requestRef.getRefJobContent())); - - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - String id = getWidgetId(requestRef.getRefJobContent()); - @SuppressWarnings("unchecked") - Map widgetData = (Map) responseRef.toMap().get("widget"); - Map jobContent = new HashMap<>(1); - jobContent.put("widgetId", Double.parseDouble(widgetData.get(id).toString())); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - @Override - public RefJobContentResponseRef importRef(ThirdlyRequestRef.ImportWitContextRequestRefImpl requestRef, - String url, - DSSPostAction visualisPostAction) throws ExternalOperationFailedException { - - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisPostAction); - Map jobContent = new HashMap<>(); - String id = getWidgetId(requestRef.getRefJobContent()); - - @SuppressWarnings("unchecked") - Map widgetData = (Map) responseRef.toMap().get("widget"); - long newId = DSSCommonUtils.parseToLong(widgetData.get(id)); - jobContent.put("widgetId", newId); - requestRef.getRefJobContent().put("widgetId", newId); - // cs更新 - updateCsRef(requestRef, requestRef.getContextId()); - return RefJobContentResponseRef.newBuilder().setRefJobContent(jobContent).success(); - } - - - @Override - public ResponseRef executeRef(RefExecutionRequestRef.RefExecutionProjectWithContextRequestRef ref) throws ExternalOperationFailedException { - String url = URLUtils.getUrl(baseUrl, URLUtils.WIDGET_DATA_URL_FORMAT, getWidgetId(ref.getRefJobContent())); - return executeRef(ref, url); - } - - private String getWidgetId(Map refJobContent) { - return NumberUtils.parseDoubleString(refJobContent.get("widgetId").toString()); - } - - private ResponseRef updateCsRef(RefJobContentRequestRef requestRef, - String contextId) throws ExternalOperationFailedException { - String url = baseUrl + 
URLUtils.widgetContextUrl; - DSSPostAction postAction = new DSSPostAction(); - postAction.addRequestPayload("id", Integer.parseInt(getWidgetId(requestRef.getRefJobContent()))); - postAction.addRequestPayload(CSCommonUtils.CONTEXT_ID_STR, contextId); - postAction.setUser(requestRef.getUserName()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - postAction.addRequestPayload("labels", routeVO); - - return VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, postAction); - } - - private ResponseRef updateCsRef(DSSJobContentRequestRef requestRef, Long widgetId, - String contextId) throws ExternalOperationFailedException { - String url = baseUrl + URLUtils.widgetContextUrl; - DSSPostAction postAction = new DSSPostAction(); - postAction.addRequestPayload("id", widgetId); - postAction.addRequestPayload(CSCommonUtils.CONTEXT_ID_STR, contextId); - postAction.setUser(requestRef.getUserName()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(requestRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - postAction.addRequestPayload("labels", routeVO); - - return VisualisCommonUtil.getExternalResponseRef(requestRef, ssoRequestOperation, url, postAction); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java deleted file mode 100644 index 44a19f2f0..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java +++ /dev/null @@ -1,61 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.project; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectCreationOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.DSSProjectContentRequestRef; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.ProjectResponseRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.linkis.server.conf.ServerConfiguration; - -public class VisualisProjectCreationOperation extends AbstractStructureOperation - implements ProjectCreationOperation { - - private final static String projectUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/projects"; - - @Override - public ProjectResponseRef createProject(DSSProjectContentRequestRef.DSSProjectContentRequestRefImpl projectRef) throws ExternalOperationFailedException { - String url = getBaseUrl() + 
projectUrl; - DSSPostAction visualisPostAction = new DSSPostAction(); - visualisPostAction.setUser(projectRef.getDSSProject().getCreateBy()); - visualisPostAction.addRequestPayload("name", projectRef.getDSSProject().getName()); - visualisPostAction.addRequestPayload("description", projectRef.getDSSProject().getDescription()); - visualisPostAction.addRequestPayload("pic", "6"); - visualisPostAction.addRequestPayload("visibility", true); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(projectRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - visualisPostAction.addRequestPayload("labels", routeVO); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(projectRef, ssoRequestOperation, url, visualisPostAction); - @SuppressWarnings("unchecked") - Long projectId = DSSCommonUtils.parseToLong(responseRef.getValue("id")); - return ProjectResponseRef.newExternalBuilder() - .setRefProjectId(projectId).success(); - } - - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectDeletionOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectDeletionOperation.java deleted file mode 100644 index f1aade374..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectDeletionOperation.java +++ /dev/null @@ -1,33 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.project; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPostAction; -import 
com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectDeletionOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.RefProjectContentRequestRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisProjectDeletionOperation extends AbstractStructureOperation - implements ProjectDeletionOperation { - - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } - - @Override - public ResponseRef deleteProject(RefProjectContentRequestRef.RefProjectContentRequestRefImpl projectRef) - throws ExternalOperationFailedException { - String url = URLUtils.getUrl(getBaseUrl(), URLUtils.PROJECT_DELETE_UPDATE_URL, projectRef.getRefProjectId().toString()); - DSSPostAction deleteAction = new DSSPostAction(); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(projectRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - deleteAction.addRequestPayload("labels", routeVO); - deleteAction.setUser(projectRef.getUserName()); - return VisualisCommonUtil.getExternalResponseRef(projectRef, ssoRequestOperation, url, deleteAction); - } -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectSearchOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectSearchOperation.java deleted file mode 100644 index ea3788914..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectSearchOperation.java +++ /dev/null @@ -1,34 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.project; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import 
com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; -import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSGetAction; -import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectSearchOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.ProjectResponseRef; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.RefProjectContentRequestRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisProjectSearchOperation extends AbstractStructureOperation - implements ProjectSearchOperation { - - @Override - public ProjectResponseRef searchProject(RefProjectContentRequestRef.RefProjectContentRequestRefImpl projectRef) throws ExternalOperationFailedException { - String url = getBaseUrl() + URLUtils.PROJECT_SEARCH_URL; - DSSGetAction visualisGetAction = new DSSGetAction(); - visualisGetAction.setUser(projectRef.getUserName()); - visualisGetAction.setParameter("keywords", projectRef.getProjectName()); - visualisGetAction.setParameter("labels", projectRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - ResponseRef responseRef = VisualisCommonUtil.getExternalResponseRef(projectRef, ssoRequestOperation, url, visualisGetAction); - return ProjectResponseRef.newExternalBuilder().setRefProjectId(DSSCommonUtils.parseToLong(responseRef.toMap().get("id"))).success(); - } - - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } -} diff --git 
a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java deleted file mode 100644 index 5f3eb718f..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java +++ /dev/null @@ -1,56 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.project; - -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectDeletionOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectSearchOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUpdateOperation; - -public class VisualisProjectService extends ProjectService { - - @Override - public boolean isCooperationSupported() { - return false; - } - - @Override - public boolean isProjectNameUnique() { - return true; - } - - @Override - protected VisualisProjectCreationOperation createProjectCreationOperation() { - return new VisualisProjectCreationOperation(); - } - - @Override - protected ProjectUpdateOperation createProjectUpdateOperation() { - return new VisualisProjectUpdateOperation(); - } - - @Override - protected ProjectDeletionOperation createProjectDeletionOperation() { - return new VisualisProjectDeletionOperation(); - } - - @Override - protected ProjectSearchOperation createProjectSearchOperation() { - return new VisualisProjectSearchOperation(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectUpdateOperation.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectUpdateOperation.java deleted file mode 100644 index 961c5570d..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectUpdateOperation.java +++ /dev/null @@ -1,33 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.project; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; -import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisCommonUtil; -import 
com.webank.wedatasphere.dss.common.label.LabelRouteVO; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSPutAction; -import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUpdateOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.ProjectUpdateRequestRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; - -public class VisualisProjectUpdateOperation extends AbstractStructureOperation - implements ProjectUpdateOperation { - @Override - protected String getAppConnName() { - return VisualisAppConn.VISUALIS_APPCONN_NAME; - } - - @Override - public ResponseRef updateProject(ProjectUpdateRequestRef.ProjectUpdateRequestRefImpl projectRef) throws ExternalOperationFailedException { - String url = URLUtils.getUrl(getBaseUrl(), URLUtils.PROJECT_DELETE_UPDATE_URL, projectRef.getRefProjectId().toString()); - DSSPutAction updateAction = new DSSPutAction(); - updateAction.addRequestPayload("description", projectRef.getDSSProject().getDescription()); - LabelRouteVO routeVO = new LabelRouteVO(); - routeVO.setRoute(projectRef.getDSSLabels().get(0).getValue().get("DSSEnv")); - updateAction.addRequestPayload("labels", routeVO); - updateAction.setUser(projectRef.getUserName()); - return VisualisCommonUtil.getExternalResponseRef(projectRef, ssoRequestOperation, url, updateAction); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisResponseRefBuilder.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisResponseRefBuilder.java deleted file mode 100644 index 0833a3367..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisResponseRefBuilder.java +++ 
/dev/null @@ -1,36 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.ref; - -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRefBuilder; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRefImpl; - -import java.util.Map; - -import static com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils.getInt; - -/** - * @author enjoyyin - * @date 2022-03-07 - * @since 0.5.0 - */ -public class VisualisResponseRefBuilder - extends ResponseRefBuilder.ExternalResponseRefBuilder { - - @Override - public VisualisResponseRefBuilder setResponseBody(String responseBody) { - super.setResponseBody(responseBody).build(); - Map headerMap = (Map) responseMap.get("header"); - if (headerMap.containsKey("code")) { - status = getInt(headerMap.get("code")); - if (status != 0 && status != 200) { - errorMsg = headerMap.get("msg").toString(); - } - } - Object payload = responseMap.get("payload"); - if(payload instanceof Map) { - setResponseMap((Map) payload); - } - return this; -// return super.setResponseBody(responseBody); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java deleted file mode 100644 index 9bcb5670b..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.service; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefCopyOperation; -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefCreationOperation; -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefDeletionOperation; -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefUpdateOperation; -import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefCRUDService; - -public class VisualisCRUDService extends AbstractRefCRUDService { - - @Override - protected VisualisRefCreationOperation createRefCreationOperation() { - return new VisualisRefCreationOperation(); - } - - @Override - protected VisualisRefCopyOperation createRefCopyOperation() { - return new VisualisRefCopyOperation(); - } - - @Override - protected VisualisRefUpdateOperation createRefUpdateOperation() { - return new VisualisRefUpdateOperation(); - } - - @Override - protected VisualisRefDeletionOperation createRefDeletionOperation() { - return new VisualisRefDeletionOperation(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisExecutionService.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisExecutionService.java deleted file mode 100644 index f93f300ed..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisExecutionService.java +++ /dev/null @@ -1,29 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.service; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefExecutionOperation; -import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExecutionService; - -public class VisualisExecutionService extends AbstractRefExecutionService { - - @Override - protected VisualisRefExecutionOperation createRefExecutionOperation() { - return new VisualisRefExecutionOperation(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java deleted file mode 100644 index 5f482448b..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java +++ /dev/null @@ -1,29 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.service; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefExportOperation; -import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExportService; - -public class VisualisRefExportService extends AbstractRefExportService { - - @Override - public VisualisRefExportOperation createRefExportOperation() { - return new VisualisRefExportOperation(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java deleted file mode 100644 index c8439eb2a..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java +++ /dev/null @@ -1,29 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.service; - -import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefImportOperation; -import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefImportService; - -public class VisualisRefImportService extends AbstractRefImportService { - - @Override - protected VisualisRefImportOperation createRefImportOperation() { - return new VisualisRefImportOperation(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java deleted file mode 100644 index e14ba8758..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.utils; - -import org.apache.commons.lang.StringUtils; - -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -public class NumberUtils { - - public static Integer getInt(Object original) { - if (original instanceof Double) { - return ((Double) original).intValue(); - } - return (Integer) original; - } - - public static String parseDoubleString(String doubleString) { - if (isDouble(doubleString)) { - Double doubleValue = Double.parseDouble(doubleString); - Integer intValue = doubleValue.intValue(); - return intValue.toString(); - } - return doubleString; - - } - - /** - * 判断字符串是不是double型 - * - * @param str - * @return - */ - public static boolean isDouble(String str) { - if (StringUtils.isEmpty(str)) { - return false; - } - Pattern pattern = Pattern.compile("[0-9]+[.]?[0-9]*[dD]?"); - Matcher isNum = pattern.matcher(str); - if (!isNum.matches()) { - return false; - } - return true; - } - - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java deleted file mode 100644 index f7800087f..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * Copyright 2019 WeBank - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appconn.visualis.utils; - -import org.apache.linkis.server.conf.ServerConfiguration; - -public class URLUtils { - public final static String widgetUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/smartcreate"; - public final static String widgetUpdateUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/rename"; - public final static String widgetContextUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/setcontext"; - public final static String widgetDeleteUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widgets"; - public final static String displayUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/displays"; - public final static String VIEW_URL = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/views"; - - public final static String dashboardPortalUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/dashboardPortals"; - public final static String displaySlideConfig = "{\"slideParams\":{\"width\":1920,\"height\":1080,\"backgroundColor\":[255,255,255],\"scaleMode\":\"noScale\",\"backgroundImage\":null}}"; - public final static String projectUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/project"; - public final static String PROJECT_COPY_URL = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/project/copy"; - - public final static String DISPLAY_PREVIEW_URL_FORMAT = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/displays/%s/preview"; - public final static String DASHBOARD_PREVIEW_URL_FORMAT = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/dashboard/portal/%s/preview"; - public final static String WIDGET_DATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/%s/getdata"; 
- public final static String VIEW_DATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/view/%s/getdata"; - public final static String VIEW_DATA_URL_SUBMIT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/view/%s/async/submit"; - public final static String VIEW_DATA_URL_IS__HIVE_DATA_SOURCE = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/view/%s/type/source"; - public final static String VIEW_DATA_URL_STATE = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/views/%s/getprogress"; - public final static String VIEW_DATA_URL_ASYNC_RESULT = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/views/%s/getresult"; - public final static String DISPLAY_METADATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/display/%s/metadata"; - public final static String DASHBOARD_METADATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/portal/%s/metadata"; - - public final static String WIDGET_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/widget/%s"; - public final static String DISPLAY_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/display/%s"; - public final static String DASHBOARD_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/portal/%s/portalName/%s"; - public final static String VIEW_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/view/%s"; - //工程搜索地址 - public final static String PROJECT_SEARCH_URL = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/check/projectName"; - // 工程删除和更新URL - public final static String PROJECT_DELETE_UPDATE_URL = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/projects/%s"; - - - public static String getUrl(String baseUrl, String format, String entityId) { - return baseUrl + String.format(format, entityId); - } - - public static String getUrl(String baseUrl, String format, String... 
ids) { - return baseUrl + String.format(format, ids); - } - - public static String getEnvUrl(String url, String labels) { - return url + "?env=" + labels.toLowerCase(); - } - -} diff --git a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisCommonUtil.java b/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisCommonUtil.java deleted file mode 100644 index c48850d65..000000000 --- a/visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisCommonUtil.java +++ /dev/null @@ -1,101 +0,0 @@ -package com.webank.wedatasphere.dss.appconn.visualis.utils; - -import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; -import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisResponseRefBuilder; -import com.webank.wedatasphere.dss.standard.app.development.ref.ExportResponseRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; -import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; -import com.webank.wedatasphere.dss.standard.app.sso.origin.request.action.DSSHttpAction; -import com.webank.wedatasphere.dss.standard.app.sso.ref.WorkspaceRequestRef; -import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; -import com.webank.wedatasphere.dss.standard.common.entity.ref.InternalResponseRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; -import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import com.webank.wedatasphere.dss.standard.sso.utils.SSOHelper; -import org.apache.linkis.httpclient.request.HttpAction; -import org.apache.linkis.httpclient.response.HttpResult; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.HashMap; -import java.util.Map; - -public class VisualisCommonUtil { - - private final static Logger logger = 
LoggerFactory.getLogger(VisualisCommonUtil.class); - - public static SSOUrlBuilderOperation getSSOUrlBuilderOperation(WorkspaceRequestRef requestRef, String url) { - SSOUrlBuilderOperation ssoUrlBuilderOperation = SSOHelper.createSSOUrlBuilderOperation(requestRef.getWorkspace()); - ssoUrlBuilderOperation.setAppName(VisualisAppConn.VISUALIS_APPCONN_NAME); - ssoUrlBuilderOperation.setReqUrl(url); - return ssoUrlBuilderOperation; - } - - public static HttpResult getHttpResult(WorkspaceRequestRef requestRef, - SSORequestOperation ssoRequestOperation, - String url, - DSSHttpAction visualisHttpAction) throws ExternalOperationFailedException { - - try { - SSOUrlBuilderOperation ssoUrlBuilderOperation = getSSOUrlBuilderOperation(requestRef, url); - visualisHttpAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); - return ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisHttpAction); - } catch (Exception e) { - throw new ExternalOperationFailedException(90177, "Create visualis node Exception", e); - } - } - - public static InternalResponseRef getInternalResponseRef(WorkspaceRequestRef requestRef, - SSORequestOperation ssoRequestOperation, - String url, - DSSHttpAction visualisHttpAction) throws ExternalOperationFailedException { - HttpResult httpResult = getHttpResult(requestRef, ssoRequestOperation, url, visualisHttpAction); - InternalResponseRef responseRef = ResponseRef.newInternalBuilder().setResponseBody(httpResult.getResponseBody()).build(); - checkResponseRef(responseRef); - return responseRef; - } - - public static RefJobContentResponseRef getRefJobContentResponseRef(WorkspaceRequestRef requestRef, - SSORequestOperation ssoRequestOperation, - String url, - DSSHttpAction visualisHttpAction) throws ExternalOperationFailedException { - ResponseRef responseRef = getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisHttpAction); - return RefJobContentResponseRef.newBuilder().setRefJobContent(responseRef.toMap()).success(); - } - - public 
static ExportResponseRef getExportResponseRef(WorkspaceRequestRef requestRef, - SSORequestOperation ssoRequestOperation, - String url, - DSSHttpAction visualisHttpAction) throws ExternalOperationFailedException { - ResponseRef responseRef = getExternalResponseRef(requestRef, ssoRequestOperation, url, visualisHttpAction); - return ExportResponseRef.newBuilder().setResourceMap(responseRef.toMap()).success(); - } - - public static ResponseRef getExternalResponseRef(WorkspaceRequestRef requestRef, - SSORequestOperation ssoRequestOperation, - String url, - DSSHttpAction visualisHttpAction) throws ExternalOperationFailedException { - HttpResult httpResult = getHttpResult(requestRef, ssoRequestOperation, url, visualisHttpAction); - logger.info("responsebody from visualis:{}", httpResult.getResponseBody()); - ResponseRef responseRef = new VisualisResponseRefBuilder().setResponseBody(httpResult.getResponseBody()).build(); - checkResponseRef(responseRef); - return responseRef; - } - - public static void checkResponseRef(ResponseRef responseRef) throws ExternalOperationFailedException { - if (responseRef.getStatus() != 0 && responseRef.getStatus() != 200) { - logger.error(responseRef.getResponseBody()); - throw new ExternalOperationFailedException(90177, responseRef.getErrorMsg(), null); - } - } - - - public static Long getNodeId(Map jobContent, String nodeName) { - Object nodeIdObj = jobContent.get(nodeName); - if (nodeIdObj == null) { - return null; - } else { - return Long.parseLong(NumberUtils.parseDoubleString(nodeIdObj.toString())); - } - } -} diff --git a/visualis-appconn/src/main/resources/init.sql b/visualis-appconn/src/main/resources/init.sql deleted file mode 100644 index 59223d655..000000000 --- a/visualis-appconn/src/main/resources/init.sql +++ /dev/null @@ -1,85 +0,0 @@ --- TODO 这里只适用于第一次安装时。如果是更新的话dss_appconn表不能先删除再插入,因为其他表如dss_workspace_appconn_role关联了appconn_id(不能变),需要使用update、alter语句更新 -select @visualis_appconnId:=id from `dss_appconn` where `appconn_name` = 
'visualis'; -delete from `dss_appconn_instance` where `appconn_id` = @visualis_appconnId; - -INSERT INTO `dss_appconn` (`appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) -VALUES ('visualis', 0, 1, 1, 1, NULL, 'com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/visualis', ''); - -select @visualis_appconnId:=id from `dss_appconn` where `appconn_name` = 'visualis'; - -delete from `dss_appconn_instance` where `homepage_uri` like '%visualis%'; -INSERT INTO `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_uri`) -VALUES (@visualis_appconnId, 'DEV', 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/', '', 'dss/visualis/#/projects'); - --- 查看"数据分析"分组ID值 -select @visualis_menuId:=id from dss_workspace_menu where name = "数据分析"; - -delete from `dss_workspace_menu_appconn` WHERE title_en='Visualis'; -INSERT INTO `dss_workspace_menu_appconn` (`appconn_id`, `menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) - VALUES(@visualis_appconnId,@visualis_menuId,'Visualis','Visualis','Visualis is a data visualization BI tool based on Davinci, with Linkis as the kernel, it supports the analysis mode of data development exploration.','Visualis是基于宜信开源项目Davinci开发的数据可视化BI工具,以任意桥(Linkis)做为内核,支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。' - ,'visualization, statement','可视化,报表','1','enter Visualis','进入Visualis','user manual','用户手册','/manual_url','shujukeshihua-logo',NULL,NULL,NULL,NULL,NULL,'shujukeshihua-icon'); - -delete from `dss_workflow_node` where `appconn_name` = 'visualis'; -insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_type`, `support_jump`, 
`submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon_path`) -values('display','visualis','linkis.appconn.visualis.display',1,'1','1','0','1','icons/display.icon'); -insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_type`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon_path`) -values('dashboard','visualis','linkis.appconn.visualis.dashboard',1,'1','1','0','1','icons/dashboard.icon'); -insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_type`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon_path`) -values('widget','visualis','linkis.appconn.visualis.widget',1,'1','1','0','1','icons/widget.icon'); -insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_type`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon_path`) -values('view','visualis','linkis.appconn.visualis.view',1,'1','1','0','1','icons/view.icon'); - -select @dss_visualis_displayId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.display'; -select @dss_visualis_dashboardId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.dashboard'; -select @dss_visualis_widgetId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.widget'; -select @dss_visualis_viewId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.view'; - -delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_displayId; -delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_dashboardId; -delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_widgetId; -delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_viewId; - -delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_displayId; -delete from `dss_workflow_node_to_ui` where 
`workflow_node_id`=@dss_visualis_dashboardId; -delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_widgetId; -delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_viewId; - --- 查找'数据可视化'工作流节点类型组的id -select @visualis_node_groupId:=id from `dss_workflow_node_group` where `name` = '数据可视化'; - -INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_displayId, @visualis_node_groupId); -INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_dashboardId, @visualis_node_groupId); -INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_widgetId, @visualis_node_groupId); -INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_viewId, @visualis_node_groupId); - --- 表中有的是重复记录,最好加上limit 1 -select @visualis_node_ui_label_name_1:=id from `dss_workflow_node_ui` where `lable_name` = '节点名' limit 1; -select @visualis_node_ui_label_name_2:=id from `dss_workflow_node_ui` where `lable_name` = '节点描述' limit 1; -select @visualis_node_ui_label_name_3:=id from `dss_workflow_node_ui` where `lable_name` = '业务标签' limit 1; -select @visualis_node_ui_label_name_4:=id from `dss_workflow_node_ui` where `lable_name` = '应用标签' limit 1; -select @visualis_node_ui_label_name_5:=id from `dss_workflow_node_ui` where `lable_name` = '是否复用引擎' limit 1; -select @visualis_node_ui_label_name_6:=id from `dss_workflow_node_ui` where `lable_name` = '绑定上游节点' limit 1; - -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId, @visualis_node_ui_label_name_1); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId, @visualis_node_ui_label_name_2); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId, @visualis_node_ui_label_name_3); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values 
(@dss_visualis_displayId, @visualis_node_ui_label_name_4); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId, @visualis_node_ui_label_name_5); - -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId, @visualis_node_ui_label_name_1); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId, @visualis_node_ui_label_name_2); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId, @visualis_node_ui_label_name_3); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId, @visualis_node_ui_label_name_5); - -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_1); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_2); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_3); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_4); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_5); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_widgetId, @visualis_node_ui_label_name_6); - -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_viewId, @visualis_node_ui_label_name_1); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_viewId, @visualis_node_ui_label_name_2); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_viewId, @visualis_node_ui_label_name_3); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values 
(@dss_visualis_viewId, @visualis_node_ui_label_name_4); -INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_viewId, @visualis_node_ui_label_name_5); \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_Davinci_difference_en.md b/visualis_docs/en_US/Visualis_Davinci_difference_en.md deleted file mode 100644 index 30cb7e299..000000000 --- a/visualis_docs/en_US/Visualis_Davinci_difference_en.md +++ /dev/null @@ -1,46 +0,0 @@ -> Functional differences between visualis and DaVinci - -## 1. Custom variable format - -DaVinci's custom variable format is $variablename$by default, and supports modifying the default format in the configuration. In visualis, variables are all in the ${variablename} format and cannot be modified. This format is consistent with the custom variables of linkis. For example: - -````sql -select * from students where class = ${className} -```` - -## 2. Organization and permission functions - -    When visualis is used as an embedded module of DSS, the organization and permission functions are removed. If you need to use the organization and permission functions consistent with DaVinci separately, you can access visualis on a separate page in the form of the following URL parameters. -````url -http://ip:port/dws/visualis/#/projects?withHeader=true -```` - -## 3. Regular mail sending function - -    In the workflow of datasphere studio, the sendmail node is provided to support sending the dashboard and display in visualis as mail content. -    DaVinci's original mail scheduled task function remains unchanged in visualis. - - -## 4.Preview function of dashboard and display - -    In order to verify the pictures actually sent by the mail, visualis changes the page to which the preview button on the dashboard/display editing interface jumps to display the actual screenshot of the dashboard/display. - - -## 5. 
User management and login - -    DaVinci's native login and user management methods are no longer supported. Visualis shares a user session with datasphere studio. After logging in from the DSS login page, you can seamlessly jump to visualis. -    At the database level, the users of visualis are changed from linkis_ Read from the user table. - -## 6. project - -    Unlike DaVinci, visualis projects can have no organization and allow projects that belong only to individuals to exist. -    The visualis project is fully synchronized with the DSS project. At the database level,read from the Visualis_project table. - -## 7. SQL split commit - -    When executing SQL through JDBC in DaVinci, if a view contains multiple SQL statements, these statements will be separated in order, and only one statement will be submitted for execution at a time. -    In visualis, the logic executed through JDBC remains unchanged. However, when spark SQL is submitted through linkis to query hive data sources, in order to ensure that the SQL of the same view is submitted to the same engine for execution, the SQL statements will not be separated in visualis, that is, the statements in each view will be submitted to linkis together. After being allocated to a specific engine for execution, the engine will execute them separately in order. - -## 8. Connect with DSS workflow -    DaVinci does not support workflow scheduling. -    DSS supports drag and drop development of visualisation visual reports, and supports coordination with DSS data development nodes for widget, display, and dashboard node development. And you can publish the execution schedule with one click and send mail. diff --git a/visualis_docs/en_US/Visualis_appconn_install_en.md b/visualis_docs/en_US/Visualis_appconn_install_en.md deleted file mode 100644 index 6727acf4e..000000000 --- a/visualis_docs/en_US/Visualis_appconn_install_en.md +++ /dev/null @@ -1,49 +0,0 @@ -> Visualis AppConn Installation - -## 1. 
AppConn installation -    The third-party component AppConn of DSS1.1.0 is maintained by the third-party component itself, so in order to successfully install visualis and support the DSS workflow, you need to pull the visualis1.0.0 code, compile and package the AppConn code. -```shell -# Enter the visualis source code project -cd visualis - -# Enter the visualis-appconn module -cd visualis-appconn - -mvn clean package -DskipTests=ture -``` -    The visualis.zip package as shown below is the package of visualis-appconn. -![](./../images/visualis_appconn.jpg) -    If you use [DSS one-click installation of the whole family bucket](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/1.1.0/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%26Linkis%E4%B8%80%E9%94%AE%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3%E5%8D%95%E6%9C%BA%E7%89%88.md) to deploy the service, you can directly use the script tool provided in its software package. After the one-click family bucket deployment is complete, you can find the script tool in the dss installation directory. Its directory structure and usage instructions are as follows. -```shell -# Go to the bin directory of the dss installation -> cd dss/bin - -# Where appconn-install.sh is the AppConn installation script tool ->> ls ->> appconn-install.sh appconn-refresh.sh checkEnv.sh executeSQL.sh install.sh start-default-appconn.sh -```` -    In order to install smoothly, the Visualis service needs to be deployed first, and then the zip package of visualis appconn needs to be placed in the specified appconn directory and decompressed. For the installation and deployment of Visualis, please refer to [Visualis Installation and Deployment Documentation](./Visualis_deploy_doc_cn.md). 
The steps for placing the visualis appconn zip package and the AppConn installation script tool are as follows: -```shell -# Put visualis appconn in the dss-appconns directory -rz -ybe ${DSS_INSTALL_HOME}/dss/dss-appconns - -# Unzip the Visualis AppConn package -unzip visualis.zip - -cd {DSS_INSTALL_HOME}/dss/bin - -> sh appconn-install.sh - -# Enter the Visualis name ->> visualis - -# Enter the Visualis frontend IP address ->> 127.0.0.1 - -# After entering the front-end port of the Visualis service ->> 8088 - -# After executing the AppConn installation script tool, the configuration information of the relevant third-party AppConn will be inserted -```` -    DSS service needs to be restarted after modification. -    If you use the domain name to access the DSS service, you need to refer to Section 5 of the [visualis installation and deployment document](./Visualis_deploy_doc_cn.md). \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_deploy_doc_en.md b/visualis_docs/en_US/Visualis_deploy_doc_en.md index f02524bf3..a8216ec39 100644 --- a/visualis_docs/en_US/Visualis_deploy_doc_en.md +++ b/visualis_docs/en_US/Visualis_deploy_doc_en.md @@ -1,290 +1,84 @@ -Visualis compile and deploy documentation ------- +> How to deploy Visualis -# 1. Environment preparation and compilation +## 1. Get installation package and deploy -## 1.1. 
Dependency environment preparation -| Dependent components | Whether it must be installed | Install through train | -| -------------- | ------ | --------------- | -| MySQL (5.5+) | 必装 | [how to install mysql](https://www.runoob.com/mysql/mysql-install.html) | -| JDK (1.8.0_141) | 必装 | [how to install mysql JDK](https://www.runoob.com/java/java-environment-setup.html) | -| Hadoop(2.7.2,Hadoop 其他版本需自行编译 Linkis) | 必装 | [how to install mysql Hadoop](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) ;[how to install mysql Hadoop](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| Spark(2.4.3,Spark 其他版本需自行编译 Linkis) | 必装 | [how to install mysql Spark](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| DSS1.1.0 | 必装 | [how to install mysql DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/1.1.0/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%26Linkis%E4%B8%80%E9%94%AE%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3%E5%8D%95%E6%9C%BA%E7%89%88.md) | -| Linkis1.1.1(大于等于该版本) | 必装 | [how to install mysql Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| Nginx | 必装 | [how to install mysql Nginx](http://nginx.org/en/linux_packages.html) | +    Get the latest installation package from our Github releases, then: -## 1.2. Create a Linux user - -    Please keep the deployment user of Visualis consistent with the deployment user of Linkis, and use hadoop user deployment. - -## 1.3. Low-level dependency component checking - -    **After installing linkis, please ensure that DSS1.1.0 and Linkis1.1.1 are basically available, you can execute SparkQL scripts on the DSS front-end interface, and you can create and execute DSS workflows normally.** - -## 1.4. Download the source package and compile the backend - -    When installing the Visualis source code, you need to download the corresponding source code package for compilation. 
At present, the Linkis1.1.1 version that Visualis depends on has been uploaded to the Maven central warehouse. As long as the Maven configuration is normal, the relevant dependencies can be pulled. **DSS 1.1.0 version is being released and has not been uploaded to the Maven central repository. You need to pull 1.1.0 of the DSS repository for compilation, and install the dependencies locally.** - -```shell -# 1. Download the source code -git clone https://github.com/WeBankFinTech/Visualis.git - -# 2. Switch to the 1.0.0 branch -git checkout 1.0.0 - -# 3. Execute compilation and packaging -cd Visualis -mvn -N install -mvn clean package -DskipTests=true -```` - -## 1.5. Compile the frontend -    Visualis is a front-end and back-end separation project. Front-end files can be compiled and packaged separately. You need to install npm tools on your computer. You can view [npm installation](https://nodejs.org/en/download/ ), on the windows machine, you can open the Terminal interface of the Idea tool, or use Git bash to complete the front-end compilation. -```shell -# Check if npm is installed -npm -v ->> 8.1.0 - -cd webapp # Enter the front-end file path -npm i # download front-end dependencies -npm run build # Compile front-end packages - -# A build file directory will be generated in the webapp directory, which is the compiled front-end package file - -# In the windows environment, compress the build directory into a zip file -```` - -## 2. Install and deploy -## 2.1. Install the backend -    Visualis uses assembly as a packaging plug-in. After compiling, go to the Visualis/assembly/target directory to find the compiled visualis-server.zip package. ````bash -# 1. Unzip the installation package -unzip visualis-server.zip -cd visualis-server + ## 1. 
Unzip the installation package +unzip visualis-assembly-0.5.0-dist-beta.7.zip +cd visualis-assembly-0.5.0-dist-beta.7 ```` -    After decompressing the visualis compilation package, enter the directory and you can see the following file directory. -```` -visualis-server - --- bin # Service start and stop script - --- conf # Service configuration directory - --- davinvi-ui # Front-end template, presence or absence does not affect use - --- lib # Service jar package storage location - --- logs # log directory -```` -    On the server to be deployed (or the server deployed by DSS), upload the visualis-server.zip package, and decompress it on the path to be deployed to complete the Visualis installation. - -## 2.2. Initialize the database -    The compilation package of Visualis is installed by decompression, and the related SQL files are not executed. Therefore, in the normal installation steps, you need to create a visualis database and execute the visualis related table building statement. -    Relevant table building statements can be found in the source code, enter the root directory of the source code, find the db folder, connect to the corresponding database, execute the following SQL file, and create the required use of Visualis surface. -```shell -# Find the corresponding sql file in the source package db directory - -# Connect to the visualis database -mysql -h 127.0.0.1 -u hadoop -d visualis -P3306 -p - -source ${visualis_home}/davinci.sql -source ${visualis_home}/ddl.sql - -# Where davinci.sql is the davinci table that visualis needs to use -# ddl.sql is a table that visualis additionally depends on -```` - - -## 2.3. Font library installation -    For mail reports, Chinese fonts need to be rendered, and the Visualis screenshot function depends on Chinese fonts, which are located in the /usr/share/fonts directory on the deployed machine. 
Create a new visualis folder, upload **pf.ttf in the ext directory of the Visualis source package to the visualis folder**, and execute the fc-cache –fv command to refresh the font cache. -```shell -# Need to switch to root user -sudo su -cd /usr/share/fonts -mkdir visualis - -# Upload pf.ttf Chinese font library -rz -ybe - -# Refresh font library cache -fc-cache –fv -```` -    When using visualis, when calling the preview function or executing Display and Dashboard in the workflow, if an error is reported: **error while loading shared libraries: libfontconfig.so.1: cannot open shared object file : No such file or directory**, an error is reported due to the lack of dependencies on the machine where visualis is deployed. Execute **sudo yum -y install fontconfig-devel** to install the dependencies. - - -## 2.4 Install the front end -    In order to better explain the front-end configuration, first give the configuration of nginx, the front-end configuration and description of nginx of visualis: -```shell -server { - - listen 8088;# a. access port - server_name localhost; +## 2. Modify configurations - location /dss/linkis { # b. static file directory of the linkis console - root /data/dss_linkis/web; - autoindex on; - } - - location /dss/visualis { # c. Front-end access path, which needs to be created manually - root /data/dss_linkis/web; # d. Visualis front-end static resource file directory, which can be freely specified - autoindex off; - } +     After the installation directory is ready, follow the steps below to modify the configurations (mainly application.yml and linkis.properties under the conf directory). - location / { # e.dss static file directory - root /data/dss_linkis/web/dist; - index index.html index.html; - } +### 2.1 Modify application.yml - location /ws { - proxy_pass http://127.0.0.1:9001; # f. linkis gateway address - # ... - } - - location /api { - proxy_pass http://127.0.0.1:9001; # g. linkis gateway address - # ...
- } -} -``` -**The above configuration c and d small items.** -```shell -# Configure the root path of static resources (used to configure the root parameter of nginx, that is, the d item) -cd /data/dss_linkis/web - -# In the previous step /data/dss_linkis/web directory, configure the front-end access url path address (ie the c small item, if not, you need to create it) -cd dss/visualis - -# Upload Visualis front-end package -rz -ybe build.zip - -unzip build.zip # Unzip the front-end package - -cd build # Enter to the decompression path - -mv * ./../ # Move the static resource files to the c-item dss/visualis path -```` -    After the front-end deployment configuration, you can restart nginx or refresh the nginx configuration to make the above configuration take effect**sudo nginx -s reload.** - - -## 2.5. Modify configuration - -### 2.5.1. Modify application.yml -    In the configuration application.yml file, configuration items 1, 2, and 3 must be configured, and other configurations can use the default values. In item 1, you need to configure some deployment IP and port information , the second item needs to configure the information of eureka, and the third item only needs to configure the link information of the database.**(The library of visualis can be the same as the library of dss, or it can be different, the deployment user needs to choose by himself)**. -````yaml -# #################################### -# 1. 
Visualis Service configuration -# #################################### +```yaml server: protocol: http - address: 127.0.0.1 # server ip address (the IP of the machine where the service is deployed) - port: 8008 # server port (visualis service process port) - url: http://127.0.0.1:8088/dss/visualis # frontend index page full path (the full path of the frontend to access visualis) - access: - address: 127.0.0.1 # frontend address (front-end deployment IP) - port: 8088 # frontend port + address: #The IP address of the deployment machine + port: #The port of this service + url: #The full path to visit the Visualis index page + access: + address: #The IP or host name of the frontend + port: #The port of the frontend - -# #################################### -# 2. eureka configuration -# #################################### eureka: client: serviceUrl: - defaultZone: http://127.0.0.1:20303/eureka/ # Configuration required - instance: - metadata-map: - test: wedatasphere -management: - endpoints: - web: - exposure: - include: refresh,info - + defaultZone: $EUREKA_URL #The eureka address -# #################################### -# 3.
Spring configuration -# #################################### spring: - main: - allow-bean-definition-overriding: true application: - name: visualis-dev + name: visualis #Service name + ## davinci datasource config datasource: - url: jdbc:mysql://127.0.0.1:3306/visualis?characterEncoding=UTF-8&allowMultiQueries=true # Configuration required - username: hadoop - password: hadoop + url: #The JDBC url of the application database + username: #The user name of the application database + password: #The password of the application database + +screenshot: + default_browser: PHANTOMJS # PHANTOMJS or CHROME + timeout_second: 1800 + phantomjs_path: ${DAVINCI3_HOME}/bin/phantomjs #selenium phantomjs Linux driver path (only needs to be filled if PHANTOMJS is chosen for default_browser) + chromedriver_path: $your_chromedriver_path$ #selenium chrome Linux driver path (only needs to be filled if CHROME is chosen for default_browser) +``` -# Keep other parameters as default, if you don't need customized modification, just use the default parameters -```` +### 2.2 Modify linkis.properties -### 2.5.2. Modify linkis.properties -````properties -# #################################### -# 1. need configuration -# need to configure -# #################################### -wds.linkis.gateway.url=http://127.0.0.1:9001 -# Others can use default parameters -# Omit configuration -```` -    **If the deployed hadoop cluster has Kerberos enabled, you need to enable Kerberos in the visualis configuration file linkis.properties file, and add the configuration items:** -````properties -wds.linkis.keytab.enable=true -```` +```properties + wds.dss.visualis.gateway.ip= #Linkis gateway ip + wds.dss.visualis.gateway.port= #Linkis gateway port +``` -## 3. Start the application +## 3. Initialize database +     Execute the statements of davinci.sql in the application database. This file can be obtained from either the release package or the source code.
-    After configuring and compiling the frontend package, you can try to start the service. Visualis is currently integrated with DSS and uses the DSS login and permission system. Before use, the DSS1.1.0 version needs to be deployed. You can refer to DSS1.1.0 one-click installation and deployment. +## 4. Start the application -### 3.1. Execute the startup script +     After modifying the configurations, enter the bin directory and start the application. -    Enter the Visualis installation directory, find the bin folder, and execute the following command in this folder. -```` -sh ./start-server.sh -```` -Note: **If the newline character of the startup script cannot be recognized when the service is started, you need to convert the script on the server and use: dos2unix xxx.sh command to convert** +### 4.1 Execute the start script + +    Enter the bin directory and execute +``` + ./start-server.sh +``` +### 4.2 Confirm that the application started successfully + +    Open the Eureka web page; if Visualis appears in the list of registered services, the application has started successfully. If Visualis does not appear after 3 minutes, go to the logs directory and open visualis.out to check the error messages. + +## 5. Deploy frontend pages + +    Visualis frontend pages should be deployed separately from the backend services. Download the installation package and unzip it to the /dss/visualis directory under the path configured in nginx. -### 3.2. Confirm that the application starts successfully -    Open the Eureka page, find the instance of the visualis service in the list of registered services, and then consider the service to start successfully. At the same time, you can also view the service startup log of visualis. If no error is reported, the service starts successfully. -```` -# View service startup log -less logs/linkis.out -```` -    Check the Eureka page to see if the service is successfully registered. -![](./../images/visualis_eureka.png) -## 4.
AppConn installation -    After the Visualis service is deployed, it needs to be connected with the DSS application store and workflow, and the corresponding AppConn needs to be installed on the DSS side. Please refer to [VisualisAppConn Installation](./Visualis_appconn_install_cn.md). -## 5. Visualis configuration instructions for domain name access to DSS (optional) -    In production, DSS is generally accessed through a domain name. The Visualis installation and AppConn deployment documents mention several front-end configuration items; these affect the preview function and the mail report function. -    If you use a domain name, pay attention to the following configuration: -1. When installing AppConn and specifying the access ip and port of the Visualis AppConn, you can write a placeholder value first. After the installation is complete, modify the url field of the dss_appconn_instance table to the domain name value, similar to: http://dss.bdp.com/ (note the trailing slash /, which cannot be omitted). -2. In the Visualis service's configuration file application.yml, the front-end ip and port must be set to the ip of the front-end nginx server and the Visualis port configured in nginx. -## 6. Log configuration (optional) -    In practice, relying on linkis.out for log output does not comply with operational standards: the log file is never rolled over, and long-running services can trigger disk-capacity alarms on production servers. Log printing can be optimized by modifying the log configuration as follows: -````properties -# Removing this configuration will cancel the linkis.out log output.
-```` \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_display_dashboard_privew_en.md b/visualis_docs/en_US/Visualis_display_dashboard_privew_en.md deleted file mode 100644 index 7be78f558..000000000 --- a/visualis_docs/en_US/Visualis_display_dashboard_privew_en.md +++ /dev/null @@ -1,49 +0,0 @@ -> Visualis Display and Dashboard preview mechanism - -## 1 Introduction -    The preview mechanism of Display and Dashboard provides the function of previewing the mail to be sent. After the development of a Display or Dashboard is completed, click the preview button in the toolbar above the component: the browser will open the preview page in a new tab, and once the page has fully loaded you can see the final image effect. The following figure is the final preview after Display development is completed, that is, the rendering effect of the final email report. -![Preview result](../images/preview_page.png) - -## 2. Design principle -    The Visualis backend provides a preview interface that serves two usage scenarios: the first is the front-end preview function of Visualis itself, and the second is the execute interfaces of Display and Dashboard called when a DSS workflow runs. The request value is mainly the primary key ID of the Display or Dashboard, and the return value is the output stream of the image.
-![Preview overall process](../images/preview.png) -    The Display and Dashboard preview interfaces are similar: for Dashboard, see the previewPortal method of the DashboardPreviewController class in the source code. The Dashboard preview aggregates images from its multiple panel pages, but the rest of the logic is basically the same. The preview interface code of Display: -````java - @MethodLog - @GetMapping(value = "/{id}/preview", produces = MediaType.IMAGE_PNG_VALUE) - @ResponseBody - public void previewDisplay(@PathVariable Long id, - @RequestParam(required = false) String username, - @CurrentUser User user, - HttpServletRequest request, - HttpServletResponse response) throws IOException { - Display display = displayMapper.getById(id); - Project project = projectMapper.getById(display.getProjectId()); - - FileInputStream inputStream = null; - try { - List imageFiles = scheduleService.getPreviewImage(user.getId(), "display", id); - File imageFile = Iterables.getFirst(imageFiles, null).getImageFile(); - if(null != imageFile) { - inputStream = new FileInputStream(imageFile); - response.setContentType(MediaType.IMAGE_PNG_VALUE); - IOUtils.copy(inputStream, response.getOutputStream()); - } else { - log.error("Execute display failed, because image file is null."); - response.sendError(504, "Execute display failed, because image file is null."); - } - } catch (Exception e) { - log.error("display preview error: ", e); - } finally { - if(null != inputStream) { - inputStream.close(); - } - } - } -```` -    The core of preview is to take screenshots of the Display and Dashboard pages. This relies on PhantomJS: Visualis uses Java's Selenium library to call PhantomJS to take screenshots, and the core logic is implemented in the ScreenshotUtil class. The screenshot depends on the binary file named phantomjs in the bin directory.
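The controller above drains the screenshot file into the HTTP response with `IOUtils.copy`. A minimal plain-JDK sketch of that stream copy (the byte values below are just illustrative test data):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Equivalent of the IOUtils.copy call in the controller above:
    // drain the screenshot InputStream into the response OutputStream.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] png = {(byte) 0x89, 'P', 'N', 'G'}; // fake image header bytes
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(png), out);
        System.out.println(copied); // 4
    }
}
```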
This is the driver that Selenium provides for PhantomJS, and the related packages can be downloaded from the Selenium official website. -    Since PhantomJS is unmaintained, there is a possibility of migrating to Chrome in the future. The corresponding driver can also be downloaded from the Selenium official website, but using Chrome requires installing the real Chrome browser on the Linux machine. If you want to switch to Chrome, you need to perform adaptation and compatibility testing. - -## 3. Preview optimization -    In actual production use, a screenshot is occasionally taken of a page whose execution has failed, so the emailed report occasionally shows an error result. To solve this production problem, we introduce a failure-tag monitoring mechanism: a **WidgetExecuteFailedTag** tag element is added on the front end and detected by the back end. -![Preview result](../images/preview_bug_fix_1.png) \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_dss_integration_en.md b/visualis_docs/en_US/Visualis_dss_integration_en.md deleted file mode 100644 index 41058c65c..000000000 --- a/visualis_docs/en_US/Visualis_dss_integration_en.md +++ /dev/null @@ -1,36 +0,0 @@ -> Visualis access to DSS/Linkis attention points - -## 1.
How to use Linkis to connect to Hive data source - -    Before you start using Visualis, execute the following SQL to insert the row that adapts the Hive data source through Linkis: - -````sql -INSERT INTO `source` (id,name,description,config,type,project_id,create_by,create_time,update_by,update_time,parent_id,full_parent_id,is_folder,`index`) VALUES(1,'hiveDataSource','','{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}','hive',-1,null,null,null,null,null,null,null,null); - -```` Note: If you want to use the metadata browsing function of Hive in the View editing interface, you need to rely on the metadata module of Linkis. - -## 2. How to use the full functionality native to the Davinci project - -    Visualis is implemented based on the open source project Davinci, but in the scenario of embedding into DSS, some of Davinci's native functions were trimmed to ensure compatibility. -    When using Visualis as a standalone BI system, you can still access the full functionality of Davinci by adding a URL parameter: -````url -http://ip:port/dws/visualis/#/projects?withHeader=true -```` - -## 3. How to use the custom variable function - -    Variables can be defined on the interface in the Davinci way, or as global variables in the DSS console. -    When referencing them, use the format ${variableName}, for example: -````sql -select * from students where class = ${className} -```` - -## 4. How to send charts by email - -    The Display/Dashboard node in DSS will obtain the screenshot corresponding to the chart from the Visualis system when sending the content as an email. To ensure that the screenshot function works, check the following points: -1. Make sure the DSS installation directory has the dss-appconn Sendemail AppConn directory. -1. According to the default_browser configured in application.yml, confirm that the corresponding selenium driver has been placed in the directory of the deployed server. -1.
Confirm that the selenium driver has been configured in the phantomjs_path or chromedriver_path of application.yml (the default is the bin directory of the installation path). -1. Confirm that the user who started Visualis has execute permission on the selenium driver file. \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_linkisdatasource_en.md b/visualis_docs/en_US/Visualis_linkisdatasource_en.md deleted file mode 100644 index 69e222677..000000000 --- a/visualis_docs/en_US/Visualis_linkisdatasource_en.md +++ /dev/null @@ -1,226 +0,0 @@ -> Visualis Access Linkis Datasource Design Manual - -## 1. Original intention -    Visualis originally had to rely on a data source to develop Views and Widgets: the data source holds the connection information, through which Visualis queries data for View development. However, traditional Visualis does not support big data scenarios, or supports only relatively simple ones (linking Hive ThriftServer through JDBC). Within WeBank, the computing middleware Linkis links a variety of big data sources and provides many enterprise-level features. To better support big data scenarios, Visualis remains compatible with the original JDBC Source and additionally provides a Linkis Datasource to link the related data sources. The most commonly used one is the Hive data source, named HiveDatasource; when a new View is created, it is bound to this data source by default. In use, the sidebar displays the databases and tables the user has permission to, like a file tree whose entries can be expanded by double-clicking. Visualis is not limited to the Hive data source and supports data source extension at the code level; among them is a new Presto data source, and further access and extension methods are left for users to discover and explore. - -## 2.
Design Ideas -    HiveDatasource follows certain conventions in Visualis. To let every user log in and use a standard Hive data source with a default configuration, a template row must be inserted into the source table in advance when the database is created. The Davinci.sql file contains the following SQL: -```sql -DELETE FROM source; -INSERT INTO `source` ( - id, - name, - description, - config, - type, - project_id, - create_by, - create_time, - update_by, - update_time, - parent_id, - full_parent_id, - is_folder, - `index`) -VALUES ( - 1, - 'hiveDataSource', - '', - '{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}', - 'hive', - -1, - null,null,null,null,null,null,null,null); -``` -    The template is inserted with primary key id 1 so that its location in the database is fixed and can be found later. If the template data source's id in the database changes, you need to modify the relevant configuration and restart the service. For the data source configuration, refer to the com.webank.wedatasphere.dss.visualis.utils.VisualisUtils class; when using it, you only need to configure the corresponding key-value pair in the linkis.properties file.
The configuration related to the data source can be referred to as follows: -```scala - // hive datasource token value - val HIVE_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.hive.datasource.token","hiveDataSource-token") - // hive datasource primary key id - val HIVE_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.hive.datasource.id",1) - // presto datasource token value - val PRESTO_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.presto.datasource.token","prestoDataSource-token") - // presto datasource primary key id - val PRESTO_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.presto.datasource.id",210) -``` -    The data source is actually created when the data source information is fetched: after logging in to Visualis and switching to the Source tab, the front end triggers the Source list interface. Its Restful interface is in the SourceController class, and the code is as follows. -```java - // original Davinci interface - @MethodLog - @GetMapping - public ResponseEntity getSources(@RequestParam Long projectId, - @CurrentUser User user, - HttpServletRequest request) { - if (invalidId(projectId)) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - List sources = sourceService.getSources(projectId, user, HttpUtils.getUserTicketId(request)); - return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(sources)); - } -``` -    In the SourceController class, we have not modified other Davinci-related implementations. To remain compatible with the data source reuse logic, the interfaces in SourceService have been modified.
The Service logic has three steps: obtain the project's Source list by project id, traverse the list to determine whether a Hive or Presto data source already exists, and, if not, insert the default one and add it to the totalSources list that is finally returned. The code is as follows: -```java - @Override - public List getSources(Long projectId, User user, String ticketId) throws NotFoundException, UnAuthorizedExecption, ServerException { - ProjectDetail projectDetail = null; - try { - projectDetail = projectService.getProjectDetail(projectId, user, false); - } catch (NotFoundException e) { - throw e; - } catch (UnAuthorizedExecption e) { - return null; - } - - // 1. Get the relevant data source through the project id - List sources = sourceMapper.getByProject(projectId); - List totalSources = Lists.newArrayList(); - totalSources.addAll(hiveDBHelper.sourcesToHiveSources(sources)); - if (!CollectionUtils.isEmpty(totalSources)) { - ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); - if (projectPermission.getSourcePermission() == UserPermissionEnum.HIDDEN.getPermission()) { - sources = null; - } - } - - // 2. Identify the type and existence of the data source - if(sources.stream().noneMatch(s -> VisualisUtils.isLinkisDataSource(s))){ - - // 3. Insert the data source - Source hiveSource = sourceMapper.getById(VisualisUtils.getHiveDataSourceId()); - hiveSource.setId(null); - hiveSource.setProjectId(projectId); - sourceMapper.insert(hiveSource); - totalSources.add(hiveDBHelper.sourceToHiveSource(hiveSource)); - } - if(getAvailableEngineTypes(user.username).contains(VisualisUtils.PRESTO().getValue()) && sources.stream().noneMatch( - s -> VisualisUtils.isPrestoDataSource(s))){ - - // 3.
Insert the data source - Source prestoSource = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - prestoSource.setId(null); - prestoSource.setProjectId(projectId); - sourceMapper.insert(prestoSource); - totalSources.add(hiveDBHelper.sourceToHiveSource(prestoSource)); - } - return totalSources; - } -``` -    Using the data source relies on the Linkis service. Linkis provides a data source acquisition interface that shields third-party components from the difficulty of obtaining Hive metastore-related information and returns database and table information in a standardized format. Visualis only needs to define the request and the parsing format to quickly integrate big data usage scenarios. Requests to the Linkis data source are forwarded by the Gateway and must carry the corresponding cookie value, namely the Linkis ticket id. The response is in JSON format and needs to be parsed when used.
The core of the relevant code is as follows: -```java -public class HttpUtils { - - // linkis gateway related interface - private static final String GATEWAY_URL = CommonConfig.GATEWAY_PROTOCOL().getValue() + - CommonConfig.GATEWAY_IP().getValue() + ":" + CommonConfig.GATEWAY_PORT().getValue(); - - // request db information interface - private static final String DATABASE_URL = GATEWAY_URL + CommonConfig.DB_URL_SUFFIX().getValue(); - - // Request table information interface - private static final String TABLE_URL = GATEWAY_URL + CommonConfig.TABLE_URL_SUFFIX().getValue(); - - // request column information interface - private static final String COLUMN_URL = GATEWAY_URL + CommonConfig.COLUMN_URL_SUFFIX().getValue(); - - public static String getDbs(String ticketId) { - // ... - HttpGet httpGet = new HttpGet(DATABASE_URL); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookie.setExpiryDate(new Date(System.currentTimeMillis() + 1000 * 60 * 60 * 24 * 30L)); - cookieStore.addCookie(cookie); - String hiveDBJson = null; - try { - CloseableHttpResponse response = httpClient.execute(httpGet); - hiveDBJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (IOException e) { - logger.error("Failed to obtain Hive database information through HTTP, reason:", e); - } - return hiveDBJson; - } - - public static String getTables(String ticketId, String hiveDBName) { - // ... 
- String tableJson = null; - try { - URIBuilder uriBuilder = new URIBuilder(TABLE_URL); - uriBuilder.addParameter("database", hiveDBName); - CookieStore cookieStore = new BasicCookieStore(); - CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); - HttpGet httpGet = new HttpGet(uriBuilder.build()); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookieStore.addCookie(cookie); - CloseableHttpResponse response = httpClient.execute(httpGet); - tableJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (URISyntaxException e) { - logger.error("{} url is wrong", TABLE_URL, e); - } catch (IOException e) { - logger.error("Failed to get the table below hive database {}", hiveDBName, e); - } - return tableJson; - } - - public static String getColumns(String dbName, String tableName, String ticketId) { - // ... 
- String columnJson = null; - try { - URIBuilder uriBuilder = new URIBuilder(COLUMN_URL); - uriBuilder.addParameter("database", dbName); - uriBuilder.addParameter("table", tableName); - CookieStore cookieStore = new BasicCookieStore(); - CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); - HttpGet httpGet = new HttpGet(uriBuilder.build()); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookieStore.addCookie(cookie); - CloseableHttpResponse response = httpClient.execute(httpGet); - columnJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (final URISyntaxException e) { - logger.error("{} url is wrong", COLUMN_URL, e); - } catch (final IOException e) { - logger.error("Failed to get hive database {}.{} field information", dbName, tableName, e); - } - return columnJson; - } -``` -    In the configured Hive data source usage scenario, the data source does not provide real execution logic. The binding logic in Visualis is: a Widget binds a View, and a View binds a Source; during execution, the database and table information is not obtained from the Source. Unlike traditional Davinci logic, the View records a query SQL, and during actual execution the Widget is rendered by submitting that SQL code. The Linkis data source therefore serves only at visual editing time.
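The gateway requests above (getDbs/getTables/getColumns) boil down to a base URL plus URL-encoded query parameters and a session-ticket cookie. A dependency-free sketch of that construction follows; the path suffix and cookie name used here are illustrative placeholders, since the real values come from CommonConfig in the code above.

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class LinkisMetaUrl {
    // Builds a table-listing request URL; the "/api/..." suffix is a
    // placeholder standing in for CommonConfig.TABLE_URL_SUFFIX().
    static String tableUrl(String gateway, String db) throws UnsupportedEncodingException {
        return gateway + "/api/rest_j/v1/datasource/tables?database="
                + URLEncoder.encode(db, "UTF-8");
    }

    // Renders the session-ticket cookie; the cookie name stands in for
    // CommonConfig.TICKET_ID_STRING().
    static String ticketCookie(String cookieName, String ticketId) {
        return cookieName + "=" + ticketId;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(tableUrl("http://127.0.0.1:9001", "default"));
        System.out.println(ticketCookie("linkis_ticket_id", "abc-123"));
    }
}
```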
-    The core fields in the View are as follows: -```json -// view bound sql -select * from default.dwc_vsbi_students_demo - -// Its indicator dimension information -{ - "id":{"sqlType":"INT","visualType":"number","modelType":"value"}, - "name":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "sex":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "age":{"sqlType":"INT","visualType":"number","modelType":"value"}, - "class":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "lesson":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "city":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "teacher":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "score":{"sqlType":"DOUBLE","visualType":"number","modelType":"value"}, - "fee":{"sqlType":"DOUBLE","visualType":"number","modelType":"value"}, - "birthday":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "exam_date":{"sqlType":"STRING","visualType":"string","modelType":"category"} -} -``` -## 3. Others -    At present, when Visualis is used by itself, the Hive Datasource provides the tooling for View queries. When developing through a DSS workflow, once the Widget is bound to an upstream table, the Widget's data is obtained from the CS service and does not involve a specific data source. The Visualis code base also integrates the Presto data source to support faster query analysis. If you need to support more data sources, refer to the implementations of the Presto and Hive data sources.
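The View model JSON shown above pairs each column with a sqlType, visualType, and modelType. The mapping rule below is inferred from that sample (numeric SQL types become "number"/"value" metrics, others become "string"/"category" dimensions); it is an illustrative sketch, not code quoted from Visualis.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ViewModel {
    // Derives a model entry like the ones in the JSON above.
    // The numeric-type list is an assumption based on the sample.
    static Map<String, String> modelEntry(String sqlType) {
        boolean numeric = sqlType.equals("INT") || sqlType.equals("DOUBLE")
                || sqlType.equals("BIGINT") || sqlType.equals("FLOAT");
        Map<String, String> entry = new LinkedHashMap<>();
        entry.put("sqlType", sqlType);
        entry.put("visualType", numeric ? "number" : "string");
        entry.put("modelType", numeric ? "value" : "category");
        return entry;
    }

    public static void main(String[] args) {
        System.out.println(modelEntry("INT"));    // {sqlType=INT, visualType=number, modelType=value}
        System.out.println(modelEntry("STRING")); // {sqlType=STRING, visualType=string, modelType=category}
    }
}
```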
\ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_sendemail_en.md b/visualis_docs/en_US/Visualis_sendemail_en.md deleted file mode 100644 index f9d1cc5d0..000000000 --- a/visualis_docs/en_US/Visualis_sendemail_en.md +++ /dev/null @@ -1,96 +0,0 @@ -> Visualis Send Email Design -## 1 Introduction -    The mail function is the data output function provided by DSS, which can be used by dragging and dropping in the workflow. At present, the mail node supports sending Visualis data display nodes, namely Display node and Dashboard node. Currently, the mail sending method adopts the method of sending pictures, and after the configuration is completed, you will receive a picture of the preview effect of Display and Dashboard in the mailbox. In mail sending, DSS uses Spring's mail sending toolkit JavaMailSenderImpl, which is implemented in the SpringJavaEmailSender class. - - -## 2. The implementation process of email sending -    Email sending is the last step in the development of workflow reports. In the SendEmail node, data output is realized by linking the sending item and binding the sending node, and its function depends on the CS service of Linkis. Since the mail node belongs to a class of AppConn, it also has related AppConn instances. Therefore, when configuring mail sending, due to the needs of the mail, you need to configure the following mail configurations, of which enhance_json is the relevant sending configuration item of SendEmail, mainly the IP of the mail server, Port, username, password, protocol. 
Its related configuration can refer to the following SQL: -```sql -INSERT INTO dss_appconn_instance ( - appconn_id, - label, - url, - enhance_json, - homepage_url, - redirect_url -) VALUES ( - 7, - 'DEV', - 'sendemail', - '{"email.host":"smtp.163.com","email.port":"25","email.username":"xxx@163.com","email.password":"xxxxx", "email.protocol":"smtp"}', - NULL, - NULL -); -``` -    Sending an email requires the cooperation of the upstream and downstream nodes: before SendEmail executes, the data visualization node has already prepared the content to be sent. On the DSS workflow side, what Display and Dashboard actually execute is a request to the preview interface (for related implementations, please refer to [Display Dashboard Preview Principle]()), using Linkis's DownloadAction to request a large result set (the pictures requested for preview belong to the large result set category by default). Below is the core logic executed in the DSS AppConn for Display and Dashboard. -![SendEmail](./../images/sendemail.png) -```scala - private ResponseRef executePreview(AsyncExecutionRequestRef ref, String previewUrl, String metaUrl) - throws ExternalOperationFailedException { -// Some code omitted...
-HttpResult metaResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperationMeta, metadataDownloadAction); - String metadata = StringUtils.chomp(IOUtils.toString(metadataDownloadAction.getInputStream(), - ServerConfiguration.BDP_SERVER_ENCODING().getValue())); // Get the output stream data of metadataDownloadAction - ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createPictureResultSetWriter(); - resultSetWriter.addMetaData(new LineMetaData(metadata)); // write result set to CS - resultSetWriter.addRecord(new LineRecord(response)); // write result set to CS - resultSetWriter.flush(); // flush the stream - IOUtils.closeQuietly(resultSetWriter); // close the stream - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); -// Some code omitted... - } -``` -    After the visualization nodes Display and Dashboard execute the preview, the result set is written to the Linkis CS service. With the result prepared, SendEmail only needs to fetch the corresponding content from the CS service when it executes. The mail node has two core pieces of logic. First, the id of each connected node is obtained from the workflow's connection information in the CS context (the NodeIDs array in the code), which is then traversed to get the id of each node's task (jobIds in the code).
The relevant core code is as follows: - -```scala - def getJobIds(refContext: ExecutionRequestRefContext): Array[Long] = { - val contextIDStr = ContextServiceUtils.getContextIDStrByMap(refContext.getRuntimeMap) - val nodeIDs = refContext.getRuntimeMap.get("content") match { - case string: String => JSONUtils.gson.fromJson(string, classOf[java.util.List[String]]) - case list: java.util.List[String] => list - } - if (null == nodeIDs || nodeIDs.length < 1){ - throw new EmailSendFailedException(80003 ,"empty result set is not allowed") - } - info(s"From cs to getJob ids $nodeIDs.") - val jobIds = nodeIDs.map(ContextServiceUtils.getNodeNameByNodeID(contextIDStr, _)).map{ nodeName => - val contextKey = new CommonContextKey - contextKey.setContextScope(ContextScope.PUBLIC) - contextKey.setContextType(ContextType.DATA) - contextKey.setKey(CSCommonUtils.NODE_PREFIX + nodeName + CSCommonUtils.JOB_ID) - LinkisJobDataServiceImpl.getInstance().getLinkisJobData(contextIDStr, SerializeHelper.serializeContextKey(contextKey)) - }.map(_.getJobID).toArray - if (null == jobIds || jobIds.length < 1){ - throw new EmailSendFailedException(80003 ,"empty result set is not allowed") - } - info(s"Job IDs is ${jobIds.toList}.") - jobIds - } -``` -    In the second step, since each job id corresponds to a result set path in the CS service, the result set path of the task execution can be obtained by calling the fetchLinkisJobResultSetPaths method; this path is the task result record stored in the CS service when the task was executed. After obtaining the result set, the mail can be sent. Mail sending is one of DSS's core functions and is DSS's data output channel. Only the core code of the interaction between Visualis and DSS report mail is described here; for other logic, refer to the DSS SendEmail code.
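A Java sketch of the getJobIds flow above may help: each workflow node id is resolved to a job id through a lookup that stands in for the Linkis CS service, and an empty input is rejected, mirroring the EmailSendFailedException check. This is an illustrative simplification, not the DSS implementation.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class JobIdLookup {
    // Resolve node ids to job ids; the Map stands in for the CS service
    // lookup (CommonContextKey + LinkisJobDataServiceImpl in the Scala code).
    static List<Long> getJobIds(List<String> nodeIds, Map<String, Long> csService) {
        if (nodeIds == null || nodeIds.isEmpty()) {
            throw new IllegalStateException("empty result set is not allowed");
        }
        return nodeIds.stream().map(csService::get).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Long> cs = Map.of("node_1", 101L, "node_2", 102L);
        System.out.println(getJobIds(Arrays.asList("node_1", "node_2"), cs)); // [101, 102]
    }
}
```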
-```scala - override protected def generateEmailContent(requestRef: ExecutionRequestRef, email: AbstractEmail): Unit = email match { - case multiContentEmail: MultiContentEmail => - val runtimeMap = getRuntimeMap(requestRef) - val refContext = getExecutionRequestRefContext(requestRef) - runtimeMap.get("category") match { - case "node" => - val resultSetFactory = ResultSetFactory.getInstance - EmailCSHelper.getJobIds(refContext).foreach { jobId => - refContext.fetchLinkisJobResultSetPaths(jobId).foreach { fsPath => - val resultSet = resultSetFactory.getResultSetByPath(fsPath) - val emailContent = resultSet.resultSetType() match { - case ResultSetFactory.PICTURE_TYPE => new PictureEmailContent(fsPath) - case ResultSetFactory.HTML_TYPE => throw new EmailSendFailedException(80003 ,"html result set is not allowed")//new HtmlEmailContent(fsPath) - case ResultSetFactory.TABLE_TYPE => throw new EmailSendFailedException(80003 ,"table result set is not allowed")//new TableEmailContent(fsPath) - case ResultSetFactory.TEXT_TYPE => throw new EmailSendFailedException(80003 ,"text result set is not allowed")//new FileEmailContent(fsPath) - } - multiContentEmail.addEmailContent(emailContent) - } - } - case "file" => throw new EmailSendFailedException(80003 ,"file content is not allowed") //addContentEmail(c => new FileEmailContent(new FsPath(c))) - case "text" => throw new EmailSendFailedException(80003 ,"text content is not allowed")//addContentEmail(new TextEmailContent(_)) - case "link" => throw new EmailSendFailedException(80003 ,"link content is not allowed")//addContentEmail(new UrlEmailContent(_)) - } - } -``` \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_sql_databind_en.md b/visualis_docs/en_US/Visualis_sql_databind_en.md deleted file mode 100644 index ae0d0fc7f..000000000 --- a/visualis_docs/en_US/Visualis_sql_databind_en.md +++ /dev/null @@ -1,54 +0,0 @@ -> Workflow widget node binding DSS result set node - -## 1. 
Brief introduction - -    Visualis, as a visual report system, is now connected with the DSS workflow: you can create a Visualis node by dragging it in for visual development. In traditional Visualis usage, the visualization component (widget) needs a View-like component to provide the data source for graphic rendering. For widgets, any structured result set can serve as the data source for visual graphics development. - -## 2. Usage -    If you need to use Visualis nodes in DSS, refer to the [visualis appconn installation and deployment document](). Any DSS data node that can generate a structured result set supports binding with a Visualis widget node. For the binding of DSS data nodes to widgets, refer to the following table: - -|Node name|Task type|Remarks| -|-----|-----|-----| -|sql|Spark SQL task|Multiple result sets are not supported| -|pyspark|Python spark task|See remarks| -|hql|Hive SQL task|Multiple result sets are not supported| - -    For SQL and HQL nodes, as long as the query does not produce multiple result sets, the generated result set is registered in the Linkis CS service as a temporary table after execution; for pyspark nodes, the DataFrame result set is likewise registered in the service and stored as a temporary table. **Note**: when using a pyspark node as the upstream table, i.e. using Spark Python to implement the data query that serves as the widget's data source, you need to generate a DataFrame result set and call the show method; the widget will then display the DataFrame's dimension information as a multi-dimensional (multi-column) table. -```python -df = spark.sql("select * from default.demo") -show(df) -``` -    The following figure shows how to bind an upstream node to a Visualis node in DSS.
-![Widget绑定上游表](./../images/widget_databind_sql.gif) - - -## 3. Implementation principle -&nbsp;&nbsp;&nbsp;&nbsp;In the DSS workflow, after a data development node is dragged in, it generates a temporary table such as `_tmp_sql_5643_rc1`. When a widget node is dragged in and bound to a data development node, the JSON of the widget node within the DSS workflow JSON is configured with bindViewKey set to the node ID of the upstream bound data development node. The DSS backend finds the node's CS cache table through the bound node ID and, when requesting the synchronous creation of a Visualis widget node, passes the CS ID of that CS table as the data source used by the widget node. -&nbsp;&nbsp;&nbsp;&nbsp;The following shows the workflow parameter JSON generated by binding the upstream SQL node when DSS drags in and creates the widget node. Here bindViewKey is the node ID of the upstream SQL node. -```json -{ - "title": "widget_2919", - "bindViewKey": "22418bea-caec-4129-93ba-ce1938274b1c", - "desc": "" -} -``` -&nbsp;&nbsp;&nbsp;&nbsp;The creation process is shown in the following figure: -![绑定数据节点](../images/sql_databind.png) - -## 3.1. Implementation details of interfacing with DSS -&nbsp;&nbsp;&nbsp;&nbsp;DSS supports widget, display and dashboard nodes; their CRUD and execution are connected with Visualis. The implementation details of connecting with DSS are as follows. -&nbsp;&nbsp;&nbsp;&nbsp;The AppConn-related logic of Visualis that needs to be implemented on the DSS side is: -1. Implement the ProjectCreationOperation in the specification, which is called when a DSS project is created. It calls the project creation interface in the Visualis controller through HTTP. (The function of displaying the project list on the native homepage of Visualis according to DSS permissions has not been realized yet.) -2. Implement the CRUD-related operation interfaces of ref in the specification. They are called when a DSS node is created.
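As a sketch of how the bindViewKey lookup described in this section could work — the field names follow the workflow parameter JSON shown above, while the helper itself and the node list shape are hypothetical, not actual DSS code:

```python
import json
from typing import Optional

def resolve_bound_node(widget_params: dict, flow_nodes: list) -> Optional[dict]:
    """Return the upstream flow node whose node ID matches the widget's
    bindViewKey, or None when the widget is unbound."""
    bind_key = widget_params.get("bindViewKey")
    for node in flow_nodes:
        if bind_key and node.get("key") == bind_key:
            return node
    return None

# "key" and the title "sql_1234" are illustrative stand-ins for the upstream node.
flow_nodes = [{"key": "22418bea-caec-4129-93ba-ce1938274b1c", "title": "sql_1234"}]
widget_params = json.loads(
    '{"title": "widget_2919", '
    '"bindViewKey": "22418bea-caec-4129-93ba-ce1938274b1c", "desc": ""}'
)
print(resolve_bound_node(widget_params, flow_nodes)["title"])  # sql_1234
```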
First determine the specific node type to be created through parameters, and then call the widget, display or dashboard creation interface in the Visualis controller through HTTP. - * Widget creation does not call the controller's default interface but a specifically defined /widget/smartcreate interface in WidgetRestfulApi, which handles additional logic such as CS access and virtual view transformation. In addition, the widget itself saves the CSID of the current workflow as the basis for querying context information; therefore, when the CSID changes, the widget/setcontext interface must be called to update the CSID recorded in the corresponding widget, otherwise the upstream table will not be found. - * In Visualis, a dashboard is actually a multi-layer structure, represented as dashboard portal → dashboard. By default, a two-layer structure is created for node docking, that is, under the dashboard portal with the same name as the node there is only one dashboard, also with the same name as the node. Therefore, the two creation interfaces must be called in turn at creation time so that the two layers share the name and are related to each other. - * A display in Visualis is actually a two-layer structure, represented by display and display slide, in a one-to-one relationship. Therefore, at creation time you need to call the two creation interfaces in turn and make them interrelated. - * Implement the VisualisRefExecutionOperation interface in the specification. When a DSS node executes, it calls the corresponding interface in Visualis through HTTP to obtain the results. - * For widget execution, call the /visualis/widget/{id}/getdata interface in WidgetRestfulApi to obtain the widget execution result. This interface is implemented specifically for interfacing with DSS.
By parsing the config field of the widget, it simulates the query-related parameters spliced by the front end and calls the background query interface to obtain the execution results. The result set of widget execution is the result set of the Spark SQL query it submits. - * To query a display/dashboard, call the preview interface of the corresponding controller to obtain the corresponding screenshot binary file as the record of the result set. The metadata of the result set needs to be obtained separately through a special interface defined in WidgetRestfulApi, with the format /widget/{type}/{id}/metadata, where the type of a display is display and the type of a dashboard is portal. Note the ID here: for a dashboard, pass the ID of the corresponding dashboard portal. The metadata interface returns a JSON structure that records the correspondence between the names of all widgets added to the display/dashboard, their fields, and their update time. - * Implement the operation interfaces for importing and exporting refs in the specification. When a DSS node performs import and export operations, it calls the corresponding interface in Visualis through HTTP. - * The import/export interface in ProjectRestfulApi implements the import and export function of Visualis. The export interface receives the project ID and the ID of the corresponding widget, display or dashboard as parameters, exports all the information as a JSON structure, and returns a resourceId and version after uploading to BML. The import interface receives the project ID and the BML resourceId and version as parameters; after downloading the JSON structure from BML, it restores it to concrete entities and returns the mapping between the old and new IDs.
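The old-to-new ID mapping returned by the import interface drives a remapping step on the AppConn side. A minimal sketch of that step, with an illustrative field name (`widgetId` is a stand-in, not the actual jobContent schema):

```python
def remap_job_content(job_content: dict, id_mapping: dict) -> dict:
    """Rewrite the ref ID recorded in a node's jobContent using the
    old-ID -> new-ID mapping returned after import."""
    updated = dict(job_content)
    old_id = updated.get("widgetId")
    if old_id in id_mapping:
        updated["widgetId"] = id_mapping[old_id]
    return updated

print(remap_job_content({"widgetId": 101, "title": "widget_2919"}, {101: 2048}))
# {'widgetId': 2048, 'title': 'widget_2919'}
```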
- * It should be noted that on the appconn side, after successful import, the ID information in the original jobcontent needs to be updated and returned to the workflow for update. -    The visualis side requires the following related modifications: -1. Access SSO specification. Since the visualis front-end and DSS share the user state, it is only necessary to implement SSO when the back-end interfaces call each other. The visualisuserinterceptor needs to be implemented to operate the user information in the HTTP session. Implement visualisssofilterinitializer, which is used to add SSO filter provided by DSS to the link of HTTP request processing of visualis. The modifyhttprequestwrapper is implemented to copy the cookie information provided by the DSS request to the cookie on the visualis side. -2. Front end modification. In order to support unified front-end access in multiple environments, the front-end page captures the parameter env={env} through the URL, converts the parameter into a route label, and puts it into all subsequent interface requests starting from this page, so that the gateway can forward the request to the visualis background instance corresponding to the corresponding dev/prod environment according to the label. diff --git a/visualis_docs/en_US/Visualis_user_manul_en.md b/visualis_docs/en_US/Visualis_user_manul_en.md deleted file mode 100644 index 84ab16f4f..000000000 --- a/visualis_docs/en_US/Visualis_user_manul_en.md +++ /dev/null @@ -1,70 +0,0 @@ -# Visualis working with documentation -## Service entrance -The visualis service is currently provided as a module of dataspherestudio. You can enter the DSS home page and then enter the service according to the following steps. -There are two ways to use visualis services: -1. Enter the workspace and enter the workflow interface through [common functions - enter workflow development]: -a) For personal testing, you can directly create a new workflow. 
-b) For formal use, it is recommended to create a new cooperation project first, give relevant personnel the permission to edit and view, enter the cooperation project, and then create a new workflow. -c) Enter the workflow interface and drag and drop the display, dashboard and widget nodes. After saving, double-click a node to jump to the corresponding editing page. -2. Enter the workspace and enter it through [common functions - enter visualis]. The usage habits are consistent with those in DWS. Note: the project, display and dashboard created from this portal cannot be referenced by the workflow, so only view and widget editing are supported. If you need to edit a display or dashboard for the purpose of sending mail, or there is a need for project collaboration, please use the first access method. -## Function overview -### Project / engineering module -There are two ways to create a project: -1. In the workflow module of DSS, create a new project, and Visualis will synchronously create a new project with the same name. -2. Create a project through the native functionality of the Visualis home page. -### Basic function modules -1. View -2. Widget component -3. Viz visualization -## Data source -1. The Hive data source does not need to be added manually; it is loaded by default. -2. You can add other JDBC data sources manually. -## View -### Add view -1. Access the view list from the left menu bar and click the Add button in the upper right corner. -2. Click Source in the upper left corner and select the corresponding data source (if it is a Hive data source, select hiveDataSource). After writing SQL in the edit box, click Execute in the lower right corner to pre-execute. -3. After execution, you can preview the execution results on the result set page below. -4. On the result set page, you can adjust the field type information after switching the tab to Model.
(if you want to use a Chinese field name, you can use the select as statement to convert the field to Chinese.) -5. After editing, you can name the view in the upper left corner and click the Save button in the upper right corner to save the view. -## Chart component -### Create chart -1. Enter the widget component list from the left menu bar and select the Add button at the top right. -2. Enter the chart editing interface. From left to right, the interface includes view field bar, chart configuration bar and chart display area. After selecting a view in the upper left corner, you can see that all fields in the view are listed in the view field column on the left. -3. Drag the fields in the field column into the indicators and dimensions in the configuration column to edit and preview the chart. Type fields can be dragged into dimensions, and numerical fields can be dragged into indicators. If you find that there is an error in the division of numeric type and sub type of a field, you can also drag the field directly up and down to the column of another type, so as to directly change the field type. -4. After clicking an indicator, you can switch the aggregation method of the indicator in the drop-down menu (sum by default). -5. After previewing the data in the display area, you can switch the display type of the chart in the configuration area. When you hover over the thumbnail icon, you can see the number of indicator dimensions that the chart type needs to meet. After adjusting according to the prompts, you can complete the editing of the chart. -6. Chart display supports two driving modes, a) perspective driving: in this mode, a chart will be generated for each indicator separately; b) Chart driven: in this mode, all data are displayed in the same chart, but the data must be within the limits of the chart. The two modes can be switched in the area shown in the following figure. -7. 
After editing the chart, enter the chart component name in the upper left corner and click save in the upper right corner. -### Filtering and sorting -1. It supports dragging any field into the filter box in the configuration bar to further filter the chart results. -2. After dragging, a selection window will appear in the interface, showing all the values of the current field. Check the required value and click save. -3. If there are more complex filtering requirements, you can switch to conditional filtering and customize personalized filtering. -4. click dimension or indicator to select sorting: -5. Note that after configuring the filter sorting options, you must click the Save button in the upper right corner to save. Otherwise, if you exit and enter again, the last configuration will not be retained. -### Chart styles and adjustments -1. In some chart types, the values of each dimension can be displayed in different colors. As shown in the following figure, drag the field into the color box. In the pop-up window, assign different colors to each value and save. -2. You can adjust the appearance of elements such as font, color, label and coordinate axis through the style bar. The optional items of each chart are different. -3. The style adjustment also needs to be saved through the Save button in the upper right corner. -## Visual presentation -### Two display forms -1. Visualis supports dashboard and display. You can select the visualization bar in the left menu bar to enter the selection interface. -2. Among them, the charts of dashboard are organized on the screen in a more orderly and unified form, and provide advanced functions such as chart linkage and global filtering. -3. The display editor has a higher degree of freedom, and supports common typesetting options such as background color, layer order, custom labels, etc., making it easy to customize a large visual screen with more artistic personality. -### DashBoard -1. 
Click Add dashboard, enter the name, and click Save to find the newly added dashboard in the list. -2. Click the icon to enter the editing interface. In the editing interface, you can create a multi-level directory structure and add sub dashboards to the directory to classify dashboards with different logic. -3. In the sub dashboard, click the Add icon in the upper right corner to select the widget chart component we created earlier and add it to the screen. -4. In step 2, you can configure the refresh interval of report data. The default is manual refresh. You can adjust it to automatic refresh in seconds. -5. Click Save to see that the selected chart has been added to the screen. At this point, you can drag to adjust the size and position of the chart. -6. All operations in the dashboard editing interface will be saved automatically without additional operations. -### Display -1. Click Add display, enter the name, and click Save to find the newly added display in the list. -2. Click the icon to enter the editing interface. In the right column, some basic customization operations for the entire display are supported. -3. Click the chart button in the upper menu bar to select the widget component to add. -4. Click the widget button in the upper menu bar to add some auxiliary widgets. Such as text label, current time, etc. -5. By dragging, you can adjust the position, zoom in and out, etc. of charts and components. If you feel the screen is too small, you can adjust the display scale in the lower right corner of the canvas. -6. Directly click a chart in the canvas, or check a layer in the layer on the right (each chart or widget constitutes a separate layer) to configure the layer independently. -7. For text labels, you can enter text in the configuration bar on the right. -### Sharing and authorization -1. It supports sharing a dashboard, a display or a widget to a third party through links. 
Note: when opening the sharing link, the third party must have logged in to DSS. -2. In addition to ordinary sharing, visualis also supports authorized sharing for specified users. Only authorized users log in and open the link can see the chart content. \ No newline at end of file diff --git a/visualis_docs/en_US/Visualis_visual_doc_en.md b/visualis_docs/en_US/Visualis_visual_doc_en.md deleted file mode 100644 index 3d3b88b10..000000000 --- a/visualis_docs/en_US/Visualis_visual_doc_en.md +++ /dev/null @@ -1,94 +0,0 @@ -> Virtual view design document - -This document mainly describes that the widget page of visualis system receives one or more metadata information (i.e. JSON format patched view, including field name, field type, original query statement and data source information) as parameters at run time, and carries out dynamic data query architecture adjustment scheme according to the information provided in the selected parameter view. - -The adjustment involves the following aspects: -1. Parameter view and source structures. -2. Widget page transformation. -3. Query logic transformation. -## Parameterized view and source structures -1. Add the following concepts: -a) Virtual view: a view without specific content. The widget bound to the view is the widget that receives view parameters for dynamic rendering. -b) Parameter view: contains the field name, field type, original query statement, data source and other information in the form of JSON, which is passed to the widget editing page as a URL parameter. -c) Parameter source: as a field of the parameter view, it is also a JSON structure. It specifies the engine type (spark, hive, JDBC, etc.), data source type (hive library table, SQL script, linkis result set, etc.), specific content of the data source, and data source (creator, such as scripts) that the view should submit when querying. 
-d) In the above concepts, a widget needs to be able to be bound to a virtual view in order to receive the passed-in parameter view. -2. The virtual view is a row inserted in the database whose project_id and source_id are -1; the sql, model, variable, and config fields are all null. -3. Creating a widget bound to a virtual view is only supported when Visualis is called by other systems and parameters are passed; you cannot manually select a virtual view when creating a widget. -4. The parameter view needs to conform to the following JSON format. -a) Correspondence between data source type and specific data content: - -|dataSourceType|dataSourceContent| -|---------------|------------------| -|resultset|Result set path| -|script|BML resource id + version| -|table|Library table name| -|context|context id, keyword| -|url|url| -``` -{ - "name": "test_view1", - "model": { - "data_id": { - "sqlType": "STRING", - "visualType": "string", - "modelType": "value" - }, - "ds": { - "sqlType": "STRING", - "visualType": "string", - "modelType": "category" - } - }, - "source": { - "engineType": "spark", //Engine type - "dataSourceType": "resultset", //Data source type: result set, script, library table - "dataSourceContent": { - "resultLocation": "/tmp/linkis/resultset/_0.dolphin" - }, - "creator": "scriptis" - }, - "params": "[]" -} -``` -5. Parameter transfer mode -a) newly created widget: /dss/visualis/#/project/3/widget/add?views=[{view1 json},{view2 json}] -b) edit widget: /dss/visualis/#/project/3/widget/4?views=[{view1 json},{view2 json}] -c) Consider providing a POST interface to directly create a virtual widget, and then open the widget to render it. -## Widget page transformation -1. Add the following concepts: -a) Virtual widget: a widget bound to a virtual view. Its config clearly indicates virtual=true; a source field is added in the config to store the parameter source, and the model field of its config stores the field information of the parameter view.
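A hedged sketch of validating a parameter view against the JSON format above; the accepted dataSourceType values mirror the table in this section, while the validator itself is illustrative, not Visualis code:

```python
# Known dataSourceType values, taken from the correspondence table above.
DATA_SOURCE_TYPES = {"resultset", "script", "table", "context", "url"}

def is_valid_parameter_view(view: dict) -> bool:
    """Check the minimal shape of a parameter view: a name, a non-empty model
    with field entries, and a source whose dataSourceType is a known type."""
    source = view.get("source") or {}
    return (
        isinstance(view.get("name"), str)
        and isinstance(view.get("model"), dict)
        and len(view["model"]) > 0
        and source.get("dataSourceType") in DATA_SOURCE_TYPES
    )

view = {
    "name": "test_view1",
    "model": {"data_id": {"sqlType": "STRING", "visualType": "string", "modelType": "value"}},
    "source": {"engineType": "spark", "dataSourceType": "resultset",
               "dataSourceContent": {"resultLocation": "/tmp/linkis/resultset/_0.dolphin"},
               "creator": "scriptis"},
    "params": "[]",
}
print(is_valid_parameter_view(view))                        # True
print(is_valid_parameter_view({"name": "x", "model": {}}))  # False
```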
-b) Context ID: when it is created as a widget node, the ContextID field is added to the config field of the widget to store the context ID corresponding to the flow where the widget node is located. (the widgets corresponding to all widget nodes are virtual widgets, and virtual=true is set when creating) -2. When opening the new widget interface: -a) If there is no URL parameter, the original logic is maintained. -b) If there is a URL parameter, it is considered that you are ready to create a new virtual widget. If there is only one parameter, select the parameter view. If there are multiple parameters, uncheck them and put them all in the drop-down list. When saving, indicate virtual=true. -3. When the edit widget interface is opened: -a) If there is no URL parameter and the widget is a virtual widget: -i. Check that the model in config is not empty. If yes, query the parameter source directly submitted to config. -ii. If the model in config is empty and the ContextID is not empty, all the upstream metadata (the backend provides interfaces) will be found according to the ContextID as an alternative view for the drop-down list. After selecting and saving the metadata in the context, the source in the config is updated to the corresponding context type and the corresponding key is recorded. -III. If there is nothing, keep the original logic of editing an empty widget page unchanged, and there is nothing meaningful to operate. -b) If there is a URL parameter and the widget is a virtual widget: if there is only one parameter, select the parameter view. If there are multiple parameters, uncheck them and put them all in the drop-down list. -c) No matter whether there is a URL parameter or not, as long as a non virtual normal widget is opened, the original logic remains unchanged. -4. For the GetData and share/data interfaces, under the virtual widget, an additional source parameter should be passed. 
During back-end processing: -a) for a virtual widget, if the source parameter is not passed, the query directly reports an error; -b) for a normal non-virtual widget, the source parameter is ignored even if it is passed. -## Query logic transformation -For the new data sources, the previous query method needs to be modified as follows: -1. SourceInitializer: initializes the data source and returns more detailed source information after initialization. -a) For the result set data source, generate or update the temp view in Spark, and supplement the returned source with the SQL that selects the temp view. -b) For the SQL script data source, pull the corresponding SQL from BML and put it into the returned source. -c) For the Hive library table data source, a spliced select statement is placed in the returned source. -d) For the CS data source, pull the specific metadata content according to the context ID, and splice the select statement into the returned source. -e) For the URL external data source, the requested data is converted into dolphin format and submitted to Spark to create a temp view, and the returned source is supplemented with SQL to select the temp view. -i. The URL data source provider directly provides data in dolphin format. -ii. You can consider preliminarily implementing the conversion from CSV and other common formats to dolphin. -iii. Other formats can be implemented later according to the requirements of docking with other systems. -2. QueryStatementGenerator: generates the corresponding query statement according to the indicator/dimension conditions and data source information. -a) The default implementation is SqlQueryStatementGenerator, which converts the indicators and dimensions into the parts of a select statement; the from part is the original query contained in the source. -b) Other languages can be implemented subsequently on demand. -3.
QueryExecution: submits the query to the corresponding engine and is responsible for obtaining the progress, status and result set. -a) For a source passed in from an external system, if creator information is present, submit to that creator's engine; if not, submit to the Visualis engine by default. -b) Synchronous query: a direct query that blocks until the result set is ready and then returns it directly. -c) Asynchronous query: submit the query, return the query ID, provide progress and status tracking, and finally obtain the result set through the ID. -4. ResultParser: converts the original result set returned by the engine into a result set format that can be returned for front-end rendering. -a) DolphinVisualisResultParser, which converts a result set in dolphin format into the Visualis front-end format. \ No newline at end of file diff --git a/visualis_docs/en_US/visualis_design_en.md b/visualis_docs/en_US/visualis_design_en.md deleted file mode 100644 index 011d9aaed..000000000 --- a/visualis_docs/en_US/visualis_design_en.md +++ /dev/null @@ -1,60 +0,0 @@ -# 1.
Functional characteristics -    Based on the DaVinci project, visualis and datasphere studio are combined to realize the following features: -- Chart watermark -- Data quality verification -- Chart presentation optimization -- Interface with the linkis computing middleware -- Scriptis result set one click visualization -- External application parameter support -- Dashboard/display set becomes workflow node of datasphere studio -- Visualis also supports the following native features of DaVinci: data sources -- Supports JDBC data sources -- Supports uploading CSV files -- Data view -- Supports defining SQL templates -- Support SQL highlighting -- Support SQL testing -- Support write back operation -- Visual components -- Supports predefined charts -- Support controller components -- Supports free styles -- Interaction capability -- Support full screen display of visual components -- Support visual component local controller -- Support filtering linkage between visual components -- Support group control controller visual components -- Support visual component local advanced filters -- Support large data display paging and slider -- Integration capability -- Supports CSV downloading of visual components -- Support public sharing of visual components -- Support visual component authorization sharing -- Support dashboard public sharing -- Support dashboard authorization sharing -# 2. Integration with DSS -    Visualis is a data visualization platform solution that provides a one-stop data visualization solution for business personnel, data engineers, data analysts, and data related positions. Users can simply configure different data sources on the front end of the visualization page, realize a set of data visualization applications, support the display of multiple data models, and provide visualization functions such as advanced interaction, industry analysis, pattern exploration, social intelligence, etc. 
Visualis is seamlessly connected with the data development, workflow scheduling, data quality verification and other modules of datasphere studio to achieve a coherent and smooth user experience in the whole process of data application development. -![](../images/1.png) - - -## 2.1. App store integration - -    Visualis implements the first level specification of DSS, connects to the application store of DSS, supports switching from DSS to visualis, and implements SSO specification, which allows secret free interworking. -![](../images/visualis_dss_1.png) - - - -## 2.2. Workflow integration - -    Visualis implements the secondary and tertiary specifications of DSS, accesses DSS engineering and orchestration (workflow), configures the workflow nodes of DSS, and supports the use of visualis by dragging and dropping in DSS workflow. -![](../images/visualis_dss_2.png) - - -## 3. architecture design - -    It is designed around the two core concepts of view (data view) and widget (visual component). View is a structured form of data, and all logic / permissions / services are expanded from view (as a virtual view in the spark SQL node of DSS workflow). A widget is a visual form of data. All display / interaction / guidance, etc. are carried out from the widget. The following figure shows the functional component modules of visualis. -![](./../../images/architecture.png) - - - - diff --git a/visualis_docs/en_US/visualis_update_en.md b/visualis_docs/en_US/visualis_update_en.md deleted file mode 100644 index 604d5bc45..000000000 --- a/visualis_docs/en_US/visualis_update_en.md +++ /dev/null @@ -1,59 +0,0 @@ -Visualis 1.0.0-rc1 upgrading to 1.0.0 using documentation - ---- - - - -## 1. The upgrade steps are mainly divided into: - -- Service stop -- Execute database upgrade script -- Replace the visualis deployment directory with a new version package -- Add and modify configuration files -- Service startup - -#### 1. 
Service stop - -Enter the deployment directory of Visualis and execute the following command under that directory to stop the Visualis services: -```shell -cd ${VISUALIS_INSTALL_PATH} -sh bin/stop-visualis-server.sh -``` - -#### 2. Execute the database upgrade SQL script - -After connecting to the Visualis database, execute the following SQL: -```sql -alter table linkis_user rename to visualis_user; -``` - -#### 3. Replace the Visualis deployment directory with the new version package - -- Back up the deployment directory of the old version of Visualis. Take this directory as an example: -```shell -mv /appcom/Install/VisualisInstall/lib /appcom/Install/VisualisInstall/lib-bak -``` -- Refer to the [Visualis installation and deployment document](./visualis_deploy_doc_cn.md). After compiling and packaging, replace lib. - -#### 4. Modify the configuration - -- To be compatible with the cookies of dss1.0.1 and linkis1.1.1, Visualis 1.0.0-rc1 requires you to delete the following parameters and use the LINKIS_USER_SESSION_TICKET_ID_V1 value configured by default in the code. - -```properties -# Delete the following configuration -wds.linkis.session.ticket.key=bdp-user-ticket-id -wds.dss.visualis.ticketid=bdp-user-ticket-id -``` -- After the configuration modification is completed, you need to reinstall the Visualis AppConn on the DSS side. To install the Visualis 1.0.0 AppConn, refer to [Visualis AppConn installation](./visualis_appconn_install_cn.md). - -#### 5. Service startup -&nbsp;&nbsp;&nbsp;&nbsp;Now you can start the new version of the Visualis services. Execute the command to start them: -```shell -sh bin/start-visualis-server.sh -``` \ No newline at end of file diff --git a/visualis_docs/en_US/visualis_use_doc_en.md b/visualis_docs/en_US/visualis_use_doc_en.md deleted file mode 100644 index 8ecd86b85..000000000 --- a/visualis_docs/en_US/visualis_use_doc_en.md +++ /dev/null @@ -1,18 +0,0 @@ -> Working with documentation -## 1.
Basic usage documentation -&nbsp;&nbsp;&nbsp;&nbsp;Visualis is a data BI product developed based on Davinci, which supports the original [Davinci user usage](https://edp963.github.io/davinci/); on this basis, Visualis provides additional functionality points, mainly result set visualization, workflow usage and email usage. - -## 2. Result set visualization -&nbsp;&nbsp;&nbsp;&nbsp;Visualis supports the interactive script analysis of DSS. After a script runs, its result set can be visually analyzed: the result set is automatically bound to a default Widget, and simple drag-and-drop then realizes the development of the Widget. -![](./../images/visualis_scriptis_visualis.gif) - -## 3. Workflow usage -&nbsp;&nbsp;&nbsp;&nbsp;Visualis is connected to the DSS workflow. When a project is created on the DSS side, the Visualis project is created synchronously. When a Visualis node is dragged into the workflow, the corresponding component is also created in the project. When using a Widget in a workflow, the Widget needs to bind an upstream table as a data source to develop visual graphics; for the related implementation principle, please refer to [Widget Node Binding DSS Result Set Node](./Visualis_sql_databind_cn.md). By dragging in the three components Widget, Display and Dashboard and connecting them into a line, you can realize a visual report. -![](./../images/visualis_workflow.gif) -&nbsp;&nbsp;&nbsp;&nbsp;Currently, Visualis 1.0.0 has added a View node to DSS 1.1.0, which is similar to the Sql node; when it is used in conjunction with a Widget node, select non-binding, double-click to enter the Widget, and choose the view in the View selection bar before saving. -## 4. Mail usage -&nbsp;&nbsp;&nbsp;&nbsp;DSS provides data output nodes. When deploying and installing DSS, you need to configure the relevant mail server settings, and before using mail you need to ensure the availability of the mail server.
Connect the mail node to the visualization node as a dependency and configure the relevant mail options, and you can send mail; the final effect of mail sending can be viewed through the preview interface of Display and Dashboard. -![](./../images/dss_sendemail.gif) - -For some usage precautions, please refer to [Visualis access to DSS/Linkis precautions](./Visualis_dss_integration_cn.md). \ No newline at end of file diff --git a/visualis_docs/images/appconn.png b/visualis_docs/images/appconn.png deleted file mode 100644 index 67a42a132..000000000 Binary files a/visualis_docs/images/appconn.png and /dev/null differ diff --git a/visualis_docs/images/dss_sendemail.gif b/visualis_docs/images/dss_sendemail.gif deleted file mode 100644 index ef696c2ff..000000000 Binary files a/visualis_docs/images/dss_sendemail.gif and /dev/null differ diff --git a/visualis_docs/images/preview.png b/visualis_docs/images/preview.png deleted file mode 100644 index 660bd5607..000000000 Binary files a/visualis_docs/images/preview.png and /dev/null differ diff --git a/visualis_docs/images/preview_bug_fix_1.png b/visualis_docs/images/preview_bug_fix_1.png deleted file mode 100644 index e7d44f238..000000000 Binary files a/visualis_docs/images/preview_bug_fix_1.png and /dev/null differ diff --git a/visualis_docs/images/preview_page.png b/visualis_docs/images/preview_page.png deleted file mode 100644 index 243f699cc..000000000 Binary files a/visualis_docs/images/preview_page.png and /dev/null differ diff --git a/visualis_docs/images/sendemail.png b/visualis_docs/images/sendemail.png deleted file mode 100644 index 15cf10c13..000000000 Binary files a/visualis_docs/images/sendemail.png and /dev/null differ diff --git a/visualis_docs/images/sql_databind.png b/visualis_docs/images/sql_databind.png deleted file mode 100644 index 22eea53b1..000000000 Binary files a/visualis_docs/images/sql_databind.png and /dev/null differ diff --git a/visualis_docs/images/visualis_appconn.jpg b/visualis_docs/images/visualis_appconn.jpg deleted file
mode 100644 index f60f989bf..000000000 Binary files a/visualis_docs/images/visualis_appconn.jpg and /dev/null differ diff --git a/visualis_docs/images/visualis_appconn_fix.png b/visualis_docs/images/visualis_appconn_fix.png deleted file mode 100644 index a900aef3d..000000000 Binary files a/visualis_docs/images/visualis_appconn_fix.png and /dev/null differ diff --git a/visualis_docs/images/visualis_dashboard_1.png b/visualis_docs/images/visualis_dashboard_1.png deleted file mode 100644 index d97440696..000000000 Binary files a/visualis_docs/images/visualis_dashboard_1.png and /dev/null differ diff --git a/visualis_docs/images/visualis_display_1.png b/visualis_docs/images/visualis_display_1.png deleted file mode 100644 index 3b0122209..000000000 Binary files a/visualis_docs/images/visualis_display_1.png and /dev/null differ diff --git a/visualis_docs/images/visualis_dss_1.png b/visualis_docs/images/visualis_dss_1.png deleted file mode 100644 index a75cba62e..000000000 Binary files a/visualis_docs/images/visualis_dss_1.png and /dev/null differ diff --git a/visualis_docs/images/visualis_dss_2.png b/visualis_docs/images/visualis_dss_2.png deleted file mode 100644 index 902e07894..000000000 Binary files a/visualis_docs/images/visualis_dss_2.png and /dev/null differ diff --git a/visualis_docs/images/visualis_eureka.png b/visualis_docs/images/visualis_eureka.png deleted file mode 100644 index 772b6f42a..000000000 Binary files a/visualis_docs/images/visualis_eureka.png and /dev/null differ diff --git a/visualis_docs/images/visualis_scriptis_visualis.gif b/visualis_docs/images/visualis_scriptis_visualis.gif deleted file mode 100644 index 75f696dbc..000000000 Binary files a/visualis_docs/images/visualis_scriptis_visualis.gif and /dev/null differ diff --git a/visualis_docs/images/visualis_sendemail_1.png b/visualis_docs/images/visualis_sendemail_1.png deleted file mode 100644 index 3e5bdb702..000000000 Binary files a/visualis_docs/images/visualis_sendemail_1.png and /dev/null 
differ diff --git a/visualis_docs/images/visualis_source.png b/visualis_docs/images/visualis_source.png deleted file mode 100644 index 008ec3f0a..000000000 Binary files a/visualis_docs/images/visualis_source.png and /dev/null differ diff --git a/visualis_docs/images/visualis_view_1.png b/visualis_docs/images/visualis_view_1.png deleted file mode 100644 index 81d371ab7..000000000 Binary files a/visualis_docs/images/visualis_view_1.png and /dev/null differ diff --git a/visualis_docs/images/visualis_view_2.png b/visualis_docs/images/visualis_view_2.png deleted file mode 100644 index 9dcf47852..000000000 Binary files a/visualis_docs/images/visualis_view_2.png and /dev/null differ diff --git a/visualis_docs/images/visualis_view_3.png b/visualis_docs/images/visualis_view_3.png deleted file mode 100644 index 7e24b1f7e..000000000 Binary files a/visualis_docs/images/visualis_view_3.png and /dev/null differ diff --git a/visualis_docs/images/visualis_widget_1.png b/visualis_docs/images/visualis_widget_1.png deleted file mode 100644 index e27056a4e..000000000 Binary files a/visualis_docs/images/visualis_widget_1.png and /dev/null differ diff --git a/visualis_docs/images/visualis_widget_2.png b/visualis_docs/images/visualis_widget_2.png deleted file mode 100644 index e098a9364..000000000 Binary files a/visualis_docs/images/visualis_widget_2.png and /dev/null differ diff --git a/visualis_docs/images/visualis_workflow.gif b/visualis_docs/images/visualis_workflow.gif deleted file mode 100644 index 922f6b263..000000000 Binary files a/visualis_docs/images/visualis_workflow.gif and /dev/null differ diff --git a/visualis_docs/images/widget_databind_sql.gif b/visualis_docs/images/widget_databind_sql.gif deleted file mode 100644 index 5396d6cce..000000000 Binary files a/visualis_docs/images/widget_databind_sql.gif and /dev/null differ diff --git a/visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md b/visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md index 29b2a1064..cd70edc77 
100644 --- a/visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md +++ b/visualis_docs/zh_CN/Visualis_Davinci_difference_cn.md @@ -2,47 +2,45 @@ ## 1. 自定义变量格式 -    Davinci的自定义变量格式默认为双美元符号的方式,并支持在配置中对默认格式进行修改,而Visualis中,变量一律为${variableName}格式,且无法修改,此格式与Linkis的自定义变量一致。如: -````sql -# davinci引入变量 -select * from students where class = $variableName$ +Davinci的自定义变量格式默认为$variableName$的方式,并支持在配置中对默认格式进行修改,而Visualis中,变量一律为${variableName}格式,且无法修改,此格式与Linkis的自定义变量一致。如: -# 对接linkis后引入变量的方式 +````sql select * from students where class = ${className} ```` ## 2. 组织与权限功能 -    在将Visualis作为DSS的内嵌模块使用时,组织和权限功能被移除了。如果需要单独使用与Davinci一致的组织和权限功能,可以通过以下url参数的形式,在单独的页面访问Visualis。 +    在将Visualis作为DSS的内嵌模块使用时,组织和权限功能被移除了。如果需要单独使用与Davinci一致的组织和权限功能,可以通过以下url参数的形式,在单独的页面访问Visualis。 ````url http://ip:port/dws/visualis/#/projects?withHeader=true ```` ## 3. 邮件定时发送功能 -    DataSphere Studio的工作流中,提供了SendEmail节点,支持将Visualis中的Dashboard和Display作为邮件发送内容。 +    DataSphere Studio的工作流中,提供了SendMail节点,支持将Visualis中的Dashboard和Display作为邮件发送内容。 +    Davinci的原有的邮件定时任务功能,在Visualis中保持不变。 ## 4. Dashboard与Display的预览功能 -    出于用户需要对邮件实际发送的图片进行验证的需求,Visualis将Dashboard/Display编辑界面上的预览按钮跳转的页面,变为了显示该Dashboard/Display的实际截图。 +    出于用户需要对邮件实际发送的图片进行验证的需求,Visualis将Dashboard/Display编辑界面上的预览按钮跳转的页面,变为了显示该Dashboard/Display的实际截图。 ## 5. 用户管理与登录 -    不再支持Davinci原生的登录和用户管理方式。Visualis与DataSphere Studio共享用户session,从DSS的登录页面登录后,即可无缝跳转到Visualis。 +    不再支持Davinci原生的登录和用户管理方式。Visualis与DataSphere Studio共享用户session,从DSS的登录页面登录后,即可无缝跳转到Visualis。     在数据库层面,Visualis的用户改为从linkis_user表中读取。 ## 6. 项目 -    与Davinci不同,Visualis的项目可以没有所属组织,允许只属于个人的项目存在。 -    Visualis的项目与DSS的项目保持完全同步,在数据库层面,从visualis_project表中读取。 +    与Davinci不同,Visualis的项目可以没有所属组织,允许只属于个人的项目存在。 +    Visualis的项目与DSS的项目保持完全同步,在数据库层面,从dss_project表中读取。 ## 7. 
SQL分割提交 -    Davinci中通过JDBC执行SQL时,如果一个View中包含多个SQL语句,这些语句将被按顺序分隔,每次仅提交执行一条语句。 -    Visualis中,通过JDBC执行的逻辑保持不变。但通过Linkis提交Spark-SQL对Hive数据源进行查询时,为了保证同一个View的SQL被提交到同一个引擎执行,在Visualis中不再对SQL语句进行分隔,即每个View中的语句将被一起提交给Linkis,在分配给具体的引擎进行执行后,由引擎按顺序分割执行。 +    Davinci中通过JDBC执行SQL时,如果一个View中包含多个SQL语句,这些语句将被按顺序分隔,每次仅提交执行一条语句。 +    Visualis中,通过JDBC执行的逻辑保持不变。但通过Linkis提交Spark-SQL对Hive数据源进行查询时,为了保证同一个View的SQL被提交到同一个引擎执行,在Visualis中不再对SQL语句进行分隔,即每个View中的语句将被一起提交给Linkis,在分配给具体的引擎进行执行后,由引擎按顺序分割执行。 + + + -## 8. 与DSS工作流打通 -    Davinci不支持工作流调度。 -    DSS支持拖拽式开发Visualis可视化报表,支持与DSS数据开发节点协调widget, display, dashboard节点开发。并且可以一键发布执行调度,并发送邮件。 diff --git a/visualis_docs/zh_CN/Visualis_appconn_install_cn.md b/visualis_docs/zh_CN/Visualis_appconn_install_cn.md deleted file mode 100644 index e82d3bcfd..000000000 --- a/visualis_docs/zh_CN/Visualis_appconn_install_cn.md +++ /dev/null @@ -1,51 +0,0 @@ -> Visualis AppConn安装 - -## 1. AppConn安装 -    DSS1.1.0的第三方组件AppConn归属于第三方组件自己维护,所以为了成功安装visualis,并支持DSS工作流,需要拉取visualis1.0.0代码,编译打包AppConn代码。 -```shell -# 进入visualis源码项目中 -cd visualis - -# 进入visualis-appconn模块 -cd visualis-appconn - -mvn clean package -DskipTests=ture -``` -    如下图visualis.zip包即为visualis-appconn的包。 -![](./../images/visualis_appconn.jpg) - - -    如果是使用[DSS一键安装全家桶](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/1.1.0/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%26Linkis%E4%B8%80%E9%94%AE%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3%E5%8D%95%E6%9C%BA%E7%89%88.md)来部署的服务,可以直接使用其软件包中提供的脚本工具。在一键全家桶部署完成后,可以在dss的安装目录下找到脚本工具,其目录结构和使用说明如下。 -```shell -# 进入到dss安装的bin目录下 ->> cd dss/bin - -# 其中appconn-install.sh就是AppConn安装脚本工具 ->> ls ->> appconn-install.sh appconn-refresh.sh checkEnv.sh executeSQL.sh install.sh start-default-appconn.sh -``` -    为了能够安装顺利,首先需要部署完成Visualis服务,然后需要把visualis appconn的zip包放置到规定的appconn目录解压。Visualis的安装部署可以参考[Visualis安装部署文档](./Visualis_deploy_doc_cn.md),visualis appconn zip包放置和AppConn安装脚本工具步骤如下: -```shell -# 把visualis appconn放置到dss-appconns目录下 -rz -ybe 
${DSS_INSTALL_HOME}/dss/dss-appconns - -# 解压Visualis AppConn包 -unzip visualis.zip - -cd {DSS_INSTALL_HOME}/dss/bin - ->> sh appconn-install.sh - -# 输入Visualis名称 ->> visualis - -# 输入Visualis前端IP地址 ->> 127.0.0.1 - -# 输入Visualis服务的前端端口后 ->> 8088 - -# 在执行AppConn安装脚本工具后,会插入相关第三方AppConn的配置信息 -``` -    修改完成后需要重启DSS服务。 -    如果是域名的方式访问DSS服务,需要参考[visualis安装部署文档](./Visualis_deploy_doc_cn.md)的第5小节。 diff --git a/visualis_docs/zh_CN/Visualis_deploy_doc_cn.md b/visualis_docs/zh_CN/Visualis_deploy_doc_cn.md index e77888030..e05b5ec27 100644 --- a/visualis_docs/zh_CN/Visualis_deploy_doc_cn.md +++ b/visualis_docs/zh_CN/Visualis_deploy_doc_cn.md @@ -1,291 +1,82 @@ -Visualis编译部署文档 ------- +> Visualis的单独安装 -# 1. 环境准备及编译 +## 1. 获取安装包并安装 -## 1.1. 依赖环境准备 -| 依赖的组件 | 是否必装 | 安装直通车 | -| -------------- | ------ | --------------- | -| MySQL (5.5+) | 必装 | [如何安装mysql](https://www.runoob.com/mysql/mysql-install.html) | -| JDK (1.8.0_141) | 必装 | [如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) | -| Hadoop(2.7.2,Hadoop 其他版本需自行编译 Linkis) | 必装 | [Hadoop单机部署](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) ;[Hadoop分布式部署](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| Spark(2.4.3,Spark 其他版本需自行编译 Linkis) | 必装 | [Spark快速安装](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| DSS1.1.0 | 必装 | [如何安装DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/1.1.0/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%26Linkis%E4%B8%80%E9%94%AE%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3%E5%8D%95%E6%9C%BA%E7%89%88.md) | -| Linkis1.1.1(大于等于该版本) | 必装 | [如何安装Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) | -| Nginx | 必装 | [如何安装 Nginx](http://nginx.org/en/linux_packages.html) | +    通过在我们的release安装包里拿到对应模块的安装包: -## 1.2. 创建 Linux 用户 - -    请保持Visualis的部署用户与Linkis的部署用户一致,采用hadoop用户部署。 - -## 1.3. 底层依赖组件检查 -    **在安装linkis后,请确保DSS1.1.0与Linkis1.1.1 基本可用,可在 DSS 前端界面执行 SparkQL 脚本,可正常创建并执行 DSS 工作流。** - -## 1.4. 
下载源码包及编译后端 -    Visualis源码安装时,需要下载对应的源码包进行编译,目前Visualis在依赖的Linkis1.1.1版本已经上传到Maven中央仓库,只需Maven配置正常即可拉取相关依赖,**DSS 1.1.0版本正在发布版本,并未上传至Maven中央仓库,需要拉取DSS仓库的1.1.0进行编译,并把依赖安装到本地。** - -```shell -# 1. 下载源码 -git clone https://github.com/WeBankFinTech/Visualis.git - -# 2. 切换到1.0.0分支 -git checkout 1.0.0 - -# 3. 执行编译打包 -cd Visualis -mvn -N install -mvn clean package -DskipTests=true -``` - -## 1.5. 编译前端 -    Visualis是一个前后端分离项目,前端文件可以单独编译打包,在电脑上需要安装npm工具,可以查看[npm安装](https://nodejs.org/en/download/),在windowns机器上,可以打开Idea工具的Terminal界面,或者使用Git bash完成前端编译。 -```shell -# 查看npm是否安装完成 -npm -v ->> 8.1.0 - -cd webapp # 进入前端文件路径 -npm i # 下载前端依赖 -npm run build # 编译前端包 - -# 在webapp目录下会生成一个build文件目录,该目录即编译完成的前端包文件 - -# 在windows环境,压缩build目录为一个zip文件即可 -``` - -## 2. 安装部署 -## 2.1. 安装后端 -    Visualis使用assembly作为打包插件,在编译完成后,进入到Visualis/assembly/target目录下,可以找到编译完成后的visualis-server.zip包。 ````bash -# 1. 解压安装包 + ## 1. 解压安装包 unzip visualis-server.zip cd visualis-server ```` -    解压完成visualis编译包后,进入目录可以看到以下文件目录。 -``` -visualis-server - --- bin # 服务启停脚本 - --- conf # 服务配置目录 - --- davinvi-ui # 前端模板,有无并不影响使用 - --- lib # 服务jar包存放位置 - --- logs # 日志目录 -``` -    在需要部署的服务器上(也可以是DSS部署的服务器),上传该visualis-server.zip包,在需要部署的路径上,解压即可完成Visualis安装。 - -## 2.2. 初始化数据库 -    Visualis的编译包,解压即为安装,并未去执行相关的SQL文件,所以在正常安装步骤中,需要建立一个visualis的数据库,并执行visualis的相关建表语句。 -    相关建表语句可以在源码中找到,进入到源码的根目录,找到db文件夹,连接到对应的数据库后,执行以下SQL文件,建立Visualis使用时需用到的表。 -```shell -# 在源码包db目录中找到对应的sql文件 - -# 连接visualis数据库 -mysql -h 127.0.0.1 -u hadoop -d visualis -P3306 -p - -source ${visualis_home}/davinci.sql -source ${visualis_home}/ddl.sql - -# 其中davinci.sql是visualis需要使用到的davinci的表 -# ddl.sql是visualis额外依赖的表 -``` - - -## 2.3. 
字体库安装 -    对于邮件报表而言,需要渲染中文字体,其中Visualis截图功能依赖中文字体,在部署的机器上/usr/share/fonts目录下。新建一个visualis文件夹,上传**Visualis源码包中ext目录下的pf.ttf到该visualis文件夹下**,执行fc-cache –fv命令刷新字体缓存即可。 -```shell -# 需要切换到root用户 -sudo su -cd /usr/share/fonts -mkdir visualis - -# 上传pf.ttf中文字体库 -rz -ybe - -# 刷新字体库缓存 -fc-cache –fv -``` -    在使用visualis时,调用预览功能或在工作流中执行Display和Dashboard时,如果提示报错:**error while loading shared libraries: libfontconfig.so.1: cannot open shared object file: No such file or directory**,是由于部署visualis的机器缺少相关依赖导致报错,执行**sudo yum -y install fontconfig-devel**安装依赖。 - - -## 2.4 安装前端 - -    为了更好的说明前端配置,首先给出nginx的配置,visualis的nginx的前端配置和说明: -```shell -server { - - listen 8088;# a. 访问端口 - server_name localhost; - - location /dss/linkis { # b. linkis管理台的静态文件目录 - root /data/dss_linkis/web; - autoindex on; - } - - location /dss/visualis { # c. 前端访问路径,需要手动创建 - root /data/dss_linkis/web; # d. visualis前端静态资源文件目录,可自由指定 - autoindex off; - } - - location / { # e. dss静态文件目录 - root /data/dss_linkis/web/dist; - index index.html index.html; - } - - location /ws { - proxy_pass http://127.0.0.1:9001; # f. linkis gateway地址 - # ... - } - - location /api { - proxy_pass http://127.0.0.1:9001; # g. linkis gateway地址 - # ... - } -} -``` -    Visualis当前使用前后端分离的部署方案,即前后端分别打包部署,完成前端编译后,把前端包(前端资源文件)放置在nginx配置的前端资源文件目录下 **(即上述配置 c和d小项)。** -```shell -# 配置静态资源根路径(用于配置nginx的root参数,即d小项) -cd /data/dss_linkis/web - -# 在上一步/data/dss_linkis/web目录下,配置前端访问url路径地址(即c小项,没有则需要创建) -cd dss/visualis - -# 上传Visualis前端包 -rz -ybe build.zip - -unzip build.zip # 解压前端包 -cd build # 进入到解压路径 - -mv * ./../ # 把静态资源文件移动到c小项dss/visualis路径下 -``` -    前端部署配置后,可以重启nginx或者刷新nginx配置使得上述配置生效**sudo nginx -s reload。** +## 2. 修改配置 +    包准备好了后,就是修改配置,配置主要修改application.yml和linkis.properties,配置都在conf目录下面 -## 2.5. 修改配置 +### 2.1 修改application.yml -### 2.5.1. 
修改application.yml -    在配置application.yml文件中,必须要配置的有1、2、3配置项,其它配置可采用默认值,其中第1项中,需要配置一些部署IP和端口信息,第2项需要配置eureka的信息,第3项中只需要配置数据库的链接信息。**(visualis的库可以和dss同库,也可以不同,需部署用户自行抉择)**。 ```yaml -# ################################## -# 1. Visualis Service configuration -# ################################## server: protocol: http - address: 127.0.0.1 # server ip address(服务部署的机器IP) - port: 8008 # server port(visualis服务进程端口) - url: http://127.0.0.1:8088/dss/visualis # frontend index page full path(前端访问visualis的完整路径) - access: - address: 127.0.0.1 # frontend address(前端部署IP) - port: 8088 # frontend port(前端访问端口) - + address: #该服务所在的机器IP + port: #对应的服务端口 + url: #访问Visualis首页的完整http路径 + access: + address: #前端部署机器的IP或域名 + port: #前端部署的端口 -# ################################## -# 2. eureka configuration -# ################################## eureka: client: serviceUrl: - defaultZone: http://127.0.0.1:20303/eureka/ # Configuration required - instance: - metadata-map: - test: wedatasphere -management: - endpoints: - web: - exposure: - include: refresh,info + defaultZone: $EUREKA_URL #对应的 EUREKA地址 - -# ################################## -# 3. Spring configuration -# ################################## spring: - main: - allow-bean-definition-overriding: true application: - name: visualis-dev + name: visualis #模块名,用于做高可用(必须) + ## davinci datasouce config datasource: - url: jdbc:mysql://127.0.0.1:3306/visualis?characterEncoding=UTF-8&allowMultiQueries=true # Configuration required - username: hadoop - password: hadoop + url: #应用数据库的JDBC URL + username: #数据库用户名 + password: #数据库密码 -# 其它参数保持默认,如果不需要定制化修改,采用默认参数即可 +screenshot: + default_browser: PHANTOMJS # PHANTOMJS or CHROME + timeout_second: 1800 + phantomjs_path: ${DAVINCI3_HOME}/bin/phantomjs #selenium phantomjs Linux driver的路径(仅在default_browser选择PHANTOMJS的时候需要填写) + chromedriver_path: $your_chromedriver_path$ #selenium chrome Linux driver的路径(仅在default_browser选择CHROME的时候需要填写) ``` -### 2.5.2. 
修改linkis.properties -```properties -# ################################## -# 1. need configuration -# 需要配置 -# ################################## -wds.linkis.gateway.url=http://127.0.0.1:9001 +### 2.2 修改linkis.properties + -# 其它可以使用默认参数 -# 省略配置 -``` -    **如果部署的hadoop集群开启了Kerberos,需要在visualis的配置文件linkis.properties文件中开启Kerberos,加入配置项:** ```properties -wds.linkis.keytab.enable=true + wds.dss.visualis.gateway.ip= #Linkis gateway的ip + wds.dss.visualis.gateway.port= #Linkis gateway的端口 ``` -## 3. 启动应用 - -    在配置和前端包编译完成后,可以尝试启动服务。Visualis目前和DSS集成,使用了DSS的登录及权限体系,使用前需部署完成DSS1.1.0版本,可以参考DSS1.1.0一键安装部署。 +## 3. 初始化数据库 +    在配置对应的数据库中执行安装包内的davinci.sql文件。 -### 3.1. 执行启动脚本 +## 4. 启动应用 -    进入Visualis的安装目录,找到bin文件夹,在此文件夹下执行一下命令。 -``` -sh ./start-server.sh -``` -备注:**如果启动服务时,报启动脚本的换行符无法识别,需要在服务器上对脚本进行编码转换使用:dos2unix xxx.sh 命令进行转换** +    修改完配置后,进入bin目录,进行应用的启动。 -### 3.2. 确认应用启动成功 +### 4.1 执行启动脚本 -    打开Eureka页面,在注册的服务列表中,找到visualis服务的实例,即可认为服务启动成功。同时也可以查看visualis的服务启动日志,如果没有报错,即服务顺利启动。 +    进入bin目录,执行 ``` -# 查看服务启动日志 -less logs/linkis.out + ./start-server.sh ``` -    查看Eureka页面,查看服务是否注册成功。 -![](./../images/visualis_eureka.png) +### 4.2 确认应用启动成功 -## 4. AppConn安装 -    Visualis服务部署后,需要和DSS应用商店和工作流打通,需要在DSS侧安装对应的AppConn,可参考[VisualisAppConn安装](./Visualis_appconn_install_cn.md)。 +    打开Eureka页面,在注册的服务列表中,找到Visualis,即可认为服务启动成功。如果3分钟内没有找到,可以到logs目录下的visualis.out中寻找错误信息。 -## 5. 有关域名访问DSS时Visualis的配置说明(可选) -    在实际生产中,访问DSS一般使用域名进行访问,读者阅读visualis安装部署文档和appconn的部署文档时会发现,visualis的配置中出现几处前端配置,这些前端配置影响预览功能和邮件报表功能。 -    如果使用域名时,需要注意以下配置: -1. AppConn安装时,指定visualis appconn的访问ip和端口时,可以先写入一个模拟值,待安装完成后,然后修改dss_appconn_instance表的url字段为域名值,类似: http://dss.bdp.com/ (注意后面的斜杠/,配置时不能遗漏)。 -2. Visualis服务的配置文件application.yml中,指定的前端ip和端口,需要指定为前端nginx服务器的ip和nginx配置的visualis端口。 +## 5. 部署前端页面 +    Visualis当前使用前后端分离的部署方案,需要下载前端的安装包后,解压到DSS的Nginx配置中/dws/visualis这个URL路径对应的服务器目录下。 -## 6. 
日志配置(可选) -    在实际的使用场景中,依赖于linkis.out日志输出场景比较不符合规范,日志文件不回滚,长时间运行容易造成生产服务器磁盘容量告警,从而带来生产问题,目前我们可以通过修改日志配置,来优化日志打印,日志配置可以参考如下修改: -```properties - - - - - - - - - - - - - - - - - # 去掉该配置即会取消掉linkis.out日志输出。 - - - -``` diff --git a/visualis_docs/zh_CN/Visualis_display_dashboard_privew_cn.md b/visualis_docs/zh_CN/Visualis_display_dashboard_privew_cn.md deleted file mode 100644 index b961d6559..000000000 --- a/visualis_docs/zh_CN/Visualis_display_dashboard_privew_cn.md +++ /dev/null @@ -1,50 +0,0 @@ -> Visualis Display和Dashboard预览机制 - -## 1. 简介 -    Display和Dashboard的预览机制,提供了对将要发送的邮件进行预览的功能。在使用上,可以在开发完成Display和Dashboard后,点击组件上面工具栏中的预览按钮,游览器会新建一个tab并打开预览页面,当页面完全打开后,能看到最终的图片效果。如下图是Display开发完成后最终的预览效果,即最终的邮件报表发送的效果图。 -![预览结果](../images/preview_page.png) - -## 2. 设计原理 -    Visualis后端提供了预览接口,该接口使用场景分为两个,第一个是支持Visualis的前端预览功能,第二个是对接DSS工作流时,Display和Dashboard执行调用的接口。请求值主要是Display和Dashboard的主键ID,其返回值为图片的输出流。 -![预览总体流程](../images/preview.png) -    Display预览和Dashboard预览接口较为类似,Dashboard的预览接口可以查看源码中DashboardPreviewController类的previewPortal方法,只是Dashboard的预览存在多个面板页,并对图片进行了聚合,其它逻辑基本相同,Display的预览接口代码: -```java - @MethodLog - @GetMapping(value = "/{id}/preview", produces = MediaType.IMAGE_PNG_VALUE) - @ResponseBody - public void previewDisplay(@PathVariable Long id, - @RequestParam(required = false) String username, - @CurrentUser User user, - HttpServletRequest request, - HttpServletResponse response) throws IOException { - Display display = displayMapper.getById(id); - Project project = projectMapper.getById(display.getProjectId()); - - FileInputStream inputStream = null; - try { - List imageFiles = scheduleService.getPreviewImage(user.getId(), "display", id); - File imageFile = Iterables.getFirst(imageFiles, null).getImageFile(); - if(null != imageFile) { - inputStream = new FileInputStream(imageFile); - response.setContentType(MediaType.IMAGE_PNG_VALUE); - IOUtils.copy(inputStream, response.getOutputStream()); - } else { - log.error("Execute display failed, because image file is 
null."); - response.sendError(504, "Execute display failed, because image file is null."); - } - } catch (Exception e) { - log.error("display preview error: ", e); - } finally { - if(null != inputStream) { - inputStream.close(); - } - } - } -``` -    预览的核心是把Display的页面和Dashboard的页面截图,其主要功能依托于PhantomJS的实现,Visualis使用Java的Selenium库调用PhantomJS进行截图,其核心逻辑在ScreenshotUtil类中实现。截图需要依赖bin目录下的名称为phantomjs的二进制文件,这个是Selenium针对PhantomJS提供的Driver驱动,其相关包可以在Selenium官网上下载(**在打包的bin目录下,已经提供一个默认的PhantomJS驱动**)。 -    由于PhantomJS属于不维护状态,未来存在迁移到Chrome的可能性,同样可以在Selenium官网上下载到对应的driver,但使用Chrome需要在Linux机器上安装真正的Chrome浏览器,如需切换为Chromer需要进行适配测试和兼容性测试。 - -## 3. 预览优化 -    在实际的生产使用时,偶发场景会出现截图出现错执行误页面的情况,导致邮件发送时,会偶发出现报表为错误结果。这是使用场景上存在的一个生产问题,为了解决这个问题,我们引入失败标签监控机制,在前后端加入**WidgetExecuteFailedTag**前端标签元素,并由后端进行检测。 -![预览结果](../images/preview_bug_fix_1.png) - diff --git a/visualis_docs/zh_CN/Visualis_dss_integration_cn.md b/visualis_docs/zh_CN/Visualis_dss_integration_cn.md index 986658f07..f8e24701b 100644 --- a/visualis_docs/zh_CN/Visualis_dss_integration_cn.md +++ b/visualis_docs/zh_CN/Visualis_dss_integration_cn.md @@ -30,7 +30,16 @@ select * from students where class = ${className} ## 4. 如何使用邮件发送图表功能     DSS中的Display/Dashboard节点,在作为邮件发送内容时,会从Visualis系统获取图表对应的截图,为了确保截图功能正常,需检查以下几点: -1. 确保DSS安装目录有dss-appconn Sendemail AppConn目录。 +1. 确认linkis-appjoint-entrance中已经添加sendmail与visualis的appjoint。 1. 根据在application.yml中的配置的default_browser,确认已经将对应的selenium driver放在部署的服务器的目录下 1. 确认selenium driver已配置在application.yml的phantomjs_path或chromedriver_path中(默认为安装路径的bin目录下) -1. 确认启动Visualis的用户对selenium driver文件具有执行权限 \ No newline at end of file +1. 确认启动Visualis的用户对selenium driver文件具有执行权限 + + + + + + + + + diff --git a/visualis_docs/zh_CN/Visualis_linkisdatasource_cn.md b/visualis_docs/zh_CN/Visualis_linkisdatasource_cn.md deleted file mode 100644 index f4695e93d..000000000 --- a/visualis_docs/zh_CN/Visualis_linkisdatasource_cn.md +++ /dev/null @@ -1,216 +0,0 @@ -> Visualis 接入Linkis Datasource设计手册 - -## 1. 
初衷 -    原始的Visualis,其必须要依赖一个数据源才能进行View和Wideget的开发,数据源需要配置相关的链接信息,Visualis才能通过配置的链接信息,查询相应的信息,提供View开发,但是传统的Visualis不支持大数据场景,或者支持的大数据场景较为简单(可以通过JDBC链接Hive ThriftServer),在微众银行企业内部,提供计算中间件Linkis链接支持多种大数据数据源,同时其提供多种企业级特性,为了更好的支持大数据场景,Visualis兼容原有的JDBC Source,并提供Linkis Datasource链接相关数据源,目前较为常用的是支持Hive Datasource,其名称为HiveDatasource,在建立新的View时,默认直接绑定为该数据源,在使用上,侧边栏会显示出其具有权限的库表信息,就像文件树类似,其库表可以鼠标双击展开。不局限于Hive Datasource,Visualis在代码层面支持数据源扩展,其中新增Presto数据源,更多的接入和扩展使用方法,需要用户自己去发现探索。 - -## 2. 设计思路 -    HiveDatasource在Visualis中存在一定的规范,为了使得每个用户登录使用时,能提供默认的配置的,一个标准的Hive数据源,在建立数据库时,需要提前在Source中插入一个模板。在Davinci.sql文件中,存在如下SQL: -```sql -DELETE FROM source; -INSERT INTO `source` ( - id, - name, - description, - config, - type, - project_id, - create_by, - create_time, - update_by, - update_time, - parent_id, - full_parent_id, - is_folder, - `index`) -VALUES ( - 1, - 'hiveDataSource', - '', - '{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}', - 'hive', - -1, - null,null,null,null,null,null,null,null); -``` -    其中默认插入主键为id为1,是为了规定好模板在数据库中的索引,使得接下来在使用时可以找到该模板位置。如果存在其它情况,该模板数据源在数据库中的索引变化,需要修改相关配置,并重启服务。有关数据源的相关配置,可以参考com.webank.wedatasphere.dss.visualis.utils.VisualisUtils类中的配置,使用时只需配置相应的键值对在linkis.properties文件中。和数据源有关的配置可以参考如下: -```scala - // hive datasource token值 - val HIVE_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.hive.datasource.token","hiveDataSource-token") - // hive datasource主键id - val HIVE_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.hive.datasource.id",1) - // presto数据源token - val PRESTO_DATA_SOURCE_TOKEN = CommonVars("wds.dss.visualis.presto.datasource.token","prestoDataSource-token") - // presto数据源token - val PRESTO_DATA_SOURCE_ID = CommonVars("wds.dss.visualis.presto.datasource.id",210) -``` -    数据源创建时刻时发生在获取数据源信息时,在登录到Visualis,切换到Source的Tab时,前端接口会触发获取Source的列表接口。其Restful接口在SourceController类中,代码如下。 -```java - // 原始的Davinci接口 - @MethodLog - @GetMapping - public ResponseEntity getSources(@RequestParam Long projectId, - @CurrentUser User 
user, - HttpServletRequest request) { - if (invalidId(projectId)) { - ResultMap resultMap = new ResultMap(tokenUtils).failAndRefreshToken(request).message("Invalid project id"); - return ResponseEntity.status(resultMap.getCode()).body(resultMap); - } - List sources = sourceService.getSources(projectId, user, HttpUtils.getUserTicketId(request)); - return ResponseEntity.ok(new ResultMap(tokenUtils).successAndRefreshToken(request).payloads(sources)); - } -``` -    在SourceController类中,我们暂未修改其它Davinci的相关实现,为了兼容数据源复用逻辑,对SourceServive其中的接口进行了相关改造,在Service中,存在三步逻辑,分别是通过工程id获取该工程下对应的Source列表,Source列表进行遍历判断是否含有Hive数据源或是Presto数据源,最后一步如果没有相关的HiveDatasource数据源或是Presto数据源会进行插入,并加入到最终需要返回的列表中totalSource,其代码如下: -```java - @Override - public List getSources(Long projectId, User user, String ticketId) throws NotFoundException, UnAuthorizedExecption, ServerException { - ProjectDetail projectDetail = null; - try { - projectDetail = projectService.getProjectDetail(projectId, user, false); - } catch (NotFoundException e) { - throw e; - } catch (UnAuthorizedExecption e) { - return null; - } - - // 1.通过工程id获取相关的数据源 - List sources = sourceMapper.getByProject(projectId); - List totalSources = Lists.newArrayList(); - totalSources.addAll(hiveDBHelper.sourcesToHiveSources(sources)); - if (!CollectionUtils.isEmpty(totalSources)) { - ProjectPermission projectPermission = projectService.getProjectPermission(projectDetail, user); - if (projectPermission.getSourcePermission() == UserPermissionEnum.HIDDEN.getPermission()) { - sources = null; - } - } - - // 2. 对数据源的种类和存在性进行判别 - if(sources.stream().noneMatch(s -> VisualisUtils.isLinkisDataSource(s))){ - - // 3. 
对数据源进行插入 - Source hiveSource = sourceMapper.getById(VisualisUtils.getHiveDataSourceId()); - hiveSource.setId(null); - hiveSource.setProjectId(projectId); - sourceMapper.insert(hiveSource); - totalSources.add(hiveDBHelper.sourceToHiveSource(hiveSource)); - } - if(getAvailableEngineTypes(user.username).contains(VisualisUtils.PRESTO().getValue()) && sources.stream().noneMatch( - s -> VisualisUtils.isPrestoDataSource(s))){ - - // 3. 对数据源进行插入 - Source prestoSource = sourceMapper.getById(VisualisUtils.getPrestoDataSourceId()); - prestoSource.setId(null); - prestoSource.setProjectId(projectId); - sourceMapper.insert(prestoSource); - totalSources.add(hiveDBHelper.sourceToHiveSource(prestoSource)); - } - return totalSources; - } -``` -    数据源在使用时,依赖了Linkis服务,Linkis提供了数据源获取接口,屏蔽掉了第三方组件获取Hive Metasource相关信息的难度,Linkis提供数据源接口,并返回其规范的库表信息格式,在这里,Visualis只需要定义好接口请求和解析格式即可,并能快速的集成大数据的使用场景。请求Linkis数据源时,需要有GateWay进行转发,并设置相应的cookie值,即linkis ticket id,请求返回的接口为JSON格式,在使用时,需要对JSON字符串进行解析,其相关代码核心如下: -```java -public class HttpUtils { - - // linkis gateway相关接口 - private static final String GATEWAY_URL = CommonConfig.GATEWAY_PROTOCOL().getValue() + - CommonConfig.GATEWAY_IP().getValue() + ":" + CommonConfig.GATEWAY_PORT().getValue(); - - // 请求db信息接口 - private static final String DATABASE_URL = GATEWAY_URL + CommonConfig.DB_URL_SUFFIX().getValue(); - - // 请求table信息接口 - private static final String TABLE_URL = GATEWAY_URL + CommonConfig.TABLE_URL_SUFFIX().getValue(); - - // 请求列信息接口 - private static final String COLUMN_URL = GATEWAY_URL + CommonConfig.COLUMN_URL_SUFFIX().getValue(); - - public static String getDbs(String ticketId) { - // ... 
- HttpGet httpGet = new HttpGet(DATABASE_URL); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookie.setExpiryDate(new Date(System.currentTimeMillis() + 1000 * 60 * 60 * 24 * 30L)); - cookieStore.addCookie(cookie); - String hiveDBJson = null; - try { - CloseableHttpResponse response = httpClient.execute(httpGet); - hiveDBJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (IOException e) { - logger.error("通过HTTP方式获取Hive数据库信息失败, reason:", e); - } - return hiveDBJson; - } - - public static String getTables(String ticketId, String hiveDBName) { - // ... - String tableJson = null; - try { - URIBuilder uriBuilder = new URIBuilder(TABLE_URL); - uriBuilder.addParameter("database", hiveDBName); - CookieStore cookieStore = new BasicCookieStore(); - CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); - HttpGet httpGet = new HttpGet(uriBuilder.build()); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookieStore.addCookie(cookie); - CloseableHttpResponse response = httpClient.execute(httpGet); - tableJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (URISyntaxException e) { - logger.error("{} url 有问题", TABLE_URL, e); - } catch (IOException e) { - logger.error("获取hive数据库 {} 下面的表失败了", hiveDBName, e); - } - return tableJson; - } - - public static String getColumns(String dbName, String tableName, String ticketId) { - // ... 
- String columnJson = null; - try { - URIBuilder uriBuilder = new URIBuilder(COLUMN_URL); - uriBuilder.addParameter("database", dbName); - uriBuilder.addParameter("table", tableName); - CookieStore cookieStore = new BasicCookieStore(); - CloseableHttpClient httpClient = HttpClientBuilder.create().setDefaultCookieStore(cookieStore).build(); - HttpGet httpGet = new HttpGet(uriBuilder.build()); - BasicClientCookie cookie = new BasicClientCookie(CommonConfig.TICKET_ID_STRING().getValue(), ticketId); - cookie.setVersion(0); - cookie.setDomain(CommonConfig.GATEWAY_IP().getValue()); - cookie.setPath("/"); - cookieStore.addCookie(cookie); - CloseableHttpResponse response = httpClient.execute(httpGet); - columnJson = EntityUtils.toString(response.getEntity(), "UTF-8"); - } catch (final URISyntaxException e) { - logger.error("{} url 有问题", COLUMN_URL, e); - } catch (final IOException e) { - logger.error("获取hive数据库 {}.{} 字段信息失败 ", dbName, tableName, e); - } - return columnJson; - } -``` -    有关配置的Hive Datasorce使用场景,该数据源并不会提供真实的执行逻辑,Visualis的绑定逻辑为,Widget需要绑定一个View,View会绑定一个Source,Widget执行时,并不会从Souce中获取执行的库表信息。在非传统的Davinci逻辑里面,会存在一个View的查询记录SQL,在真实执行时,通过提交该SQL代码Widget渲染的逻辑,所以,Linkis数据源仅仅是提供一个可视化编辑时的工具组件,并不会影响真实的执行。 -    View中的核心字段如下: -```json -// view绑定的sql -select * from default.dwc_vsbi_students_demo - -// 其指标维度信息 -{ - "id":{"sqlType":"INT","visualType":"number","modelType":"value"}, - "name":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "sex":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "age":{"sqlType":"INT","visualType":"number","modelType":"value"}, - "class":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "lesson":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "city":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "teacher":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "score":{"sqlType":"DOUBLE","visualType":"number","modelType":"value"}, 
- "fee":{"sqlType":"DOUBLE","visualType":"number","modelType":"value"}, - "birthday":{"sqlType":"STRING","visualType":"string","modelType":"category"}, - "exam_date":{"sqlType":"STRING","visualType":"string","modelType":"category"} -} -``` -## 3. 其它 -    目前如果在Visualis自身使用时,Visualis支持Hive Datasource,来提供View查询时的工具组件,如果是通过DSS工作流进行开发,Widget绑定上游表时,其Widget的数据是从CS服务中获取的,并不涉及到具体的数据源,目前Visualis代码层面还集成了Presto数据源,支持更加快速的查询分析,如果需要提供更多数据源的支持,可以参考Presto和Hive数据源的相关实现。 diff --git a/visualis_docs/zh_CN/Visualis_sendemail_cn.md b/visualis_docs/zh_CN/Visualis_sendemail_cn.md deleted file mode 100644 index 66a4ab6f8..000000000 --- a/visualis_docs/zh_CN/Visualis_sendemail_cn.md +++ /dev/null @@ -1,98 +0,0 @@ -> Visualis 发送邮件设计 -## 1. 简介 -    邮件功能是DSS提供的数据输出功能,其在工作流中可以通过拖拽的方式进行使用。目前邮件节点支持发送Visualis的数据展示节点,即Display节点和Dashboard节点,目前邮件发送采用了图片发送的方式,及配置完成后,在邮箱会收到一张Display和Dashboard的预览效果的图片。在邮件发送中,DSS采用了Spring的邮件发送工具包JavaMailSenderImpl,相关实现在SpringJavaEmailSender类中。 - - -## 2. 邮件发送的实现过程 -    邮件发送是开发工作流报表的最后一步,在SendEmail节点中,通过链接发送项及绑定发送节点实现数据输出,其功能依赖于Linkis的CS服务。由于邮件节点属于一类AppConn,其也存在相关AppConn的实例,所以在配置邮件发送时,由于邮件需要,需要配置以下邮件配置,其中enhance_json为SendEmail的相关发送配置项,主要为邮件服务器的IP、端口、用户名、密码、协议。其相关配置可以参考如下SQL: -```sql -INSERT INTO dss_appconn_instance ( - appconn_id, - label, - url, - enhance_json, - homepage_url, - redirect_url -) VALUES ( - 7, - 'DEV', - 'sendemail', - '{"email.host":"smtp.163.com","email.port":"25","email.username":"xxx@163.com","email.password":"xxxxx","email.protocol":"smtp"}', - NULL, - NULL -); -``` - -    邮件发送的过程需要上下有节点相互配合,在SendEmail执行前,数据可视化节点执行时就已经把相关的发送结果准备完成,在DSS工作流侧,Display和Dashboard执行实际上是去请求preview接口,相关实现可以参考[Display Dashboard预览原理](),使用Linkis的DownloadAction来请求大的结果集(我们默认请求preview的图片是属于大的结果集)。下面是Display和Dashboard的在DSS AppConn执行的核心逻辑。 -![SendEmail](./../images/sendemail.png) -```scala - private ResponseRef executePreview(AsyncExecutionRequestRef ref, String previewUrl, String metaUrl) - throws ExternalOperationFailedException { -// 部分代码省略... 
-HttpResult metaResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperationMeta, metadataDownloadAction); - String metadata = StringUtils.chomp(IOUtils.toString(metadataDownloadAction.getInputStream(), - ServerConfiguration.BDP_SERVER_ENCODING().getValue())); // 获得metadataDownloadAction的输出流数据 - ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createPictureResultSetWriter(); - resultSetWriter.addMetaData(new LineMetaData(metadata)); // 写结果集到CS - resultSetWriter.addRecord(new LineRecord(response)); // 写结果集到CS - resultSetWriter.flush(); // 刷新流 - IOUtils.closeQuietly(resultSetWriter); // 关闭流 - ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); -// 部分代码省略... - } -``` - -    在可视化节点Dispaly和Dashboard执行预览后,其结果集会写入到Linkis的CS服务中,有了需要发送的结果,在SendEmail执行时,只需要从Linkis CS服务中获取相应的内容即可,邮件节点,大概有两块核心逻辑,第一,通过上线文,从工作流的上线文CS中获取各个节点的id,在代码中为NodeIDs数组,然后把该数据进行遍历获取到每个节点任务的id,在代码中为jobIds,相关核心代码如下: - -```scala - def getJobIds(refContext: ExecutionRequestRefContext): Array[Long] = { - val contextIDStr = ContextServiceUtils.getContextIDStrByMap(refContext.getRuntimeMap) - val nodeIDs = refContext.getRuntimeMap.get("content") match { - case string: String => JSONUtils.gson.fromJson(string, classOf[java.util.List[String]]) - case list: java.util.List[String] => list - } - if (null == nodeIDs || nodeIDs.length < 1){ - throw new EmailSendFailedException(80003 ,"empty result set is not allowed") - } - info(s"From cs to getJob ids $nodeIDs.") - val jobIds = nodeIDs.map(ContextServiceUtils.getNodeNameByNodeID(contextIDStr, _)).map{ nodeName => - val contextKey = new CommonContextKey - contextKey.setContextScope(ContextScope.PUBLIC) - contextKey.setContextType(ContextType.DATA) - contextKey.setKey(CSCommonUtils.NODE_PREFIX + nodeName + CSCommonUtils.JOB_ID) - LinkisJobDataServiceImpl.getInstance().getLinkisJobData(contextIDStr, SerializeHelper.serializeContextKey(contextKey)) - }.map(_.getJobID).toArray - if (null == jobIds || jobIds.length < 1){ - throw 
new EmailSendFailedException(80003 ,"empty result set is not allowed") - } - info(s"Job IDs is ${jobIds.toList}.") - jobIds - } -``` -    第二步,运行时job的任务id在CS服务中对应着运行产生的结果集路径。通过调用fetchLinkisJobResultSetPaths方法,可以得到任务执行的结果集路径,即任务执行时存入到CS服务中的任务结果记录。获取到相关结果集后,即可进行邮件发送。邮件发送属于DSS的核心功能之一,是DSS数据输出的功能。这里只对Visualis和DSS报表邮件交互的核心代码进行描述,其它相关逻辑需要参考DSS SendEmail代码的相关逻辑。 -```scala - override protected def generateEmailContent(requestRef: ExecutionRequestRef, email: AbstractEmail): Unit = email match { - case multiContentEmail: MultiContentEmail => - val runtimeMap = getRuntimeMap(requestRef) - val refContext = getExecutionRequestRefContext(requestRef) - runtimeMap.get("category") match { - case "node" => - val resultSetFactory = ResultSetFactory.getInstance - EmailCSHelper.getJobIds(refContext).foreach { jobId => - refContext.fetchLinkisJobResultSetPaths(jobId).foreach { fsPath => - val resultSet = resultSetFactory.getResultSetByPath(fsPath) - val emailContent = resultSet.resultSetType() match { - case ResultSetFactory.PICTURE_TYPE => new PictureEmailContent(fsPath) - case ResultSetFactory.HTML_TYPE => throw new EmailSendFailedException(80003 ,"html result set is not allowed")//new HtmlEmailContent(fsPath) - case ResultSetFactory.TABLE_TYPE => throw new EmailSendFailedException(80003 ,"table result set is not allowed")//new TableEmailContent(fsPath) - case ResultSetFactory.TEXT_TYPE => throw new EmailSendFailedException(80003 ,"text result set is not allowed")//new FileEmailContent(fsPath) - } - multiContentEmail.addEmailContent(emailContent) - } - } - case "file" => throw new EmailSendFailedException(80003 ,"file content is not allowed") //addContentEmail(c => new FileEmailContent(new FsPath(c))) - case "text" => throw new EmailSendFailedException(80003 ,"text content is not allowed")//addContentEmail(new TextEmailContent(_)) - case "link" => throw new EmailSendFailedException(80003 ,"link content is not allowed")//addContentEmail(new UrlEmailContent(_)) - } - } -```
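上述邮件节点的两步流程(先由工作流上下文中的节点ID换取任务jobId,再按结果集类型组装邮件内容,且只允许图片类型结果集)可以用下面的Python代码示意。注意:其中的常量、函数名与参数均为示意用的假设命名,并非Visualis/DSS的真实API:

```python
# 示意代码:SendEmail节点组装邮件内容的两步核心逻辑
# 注意:PICTURE_TYPE、job_id_lookup、result_paths_of 等均为假设命名,仅用于说明流程

PICTURE_TYPE = "picture"  # 假设的图片结果集类型标识


class EmailSendFailedException(Exception):
    """对应Scala实现中的EmailSendFailedException(80003, ...)"""

    def __init__(self, code, msg):
        super().__init__("errCode {}: {}".format(code, msg))
        self.code = code


def get_job_ids(node_ids, job_id_lookup):
    """第一步:由上下文CS中取到的节点ID列表,逐个换取节点任务的jobId。"""
    if not node_ids:
        raise EmailSendFailedException(80003, "empty result set is not allowed")
    job_ids = [job_id_lookup[node_id] for node_id in node_ids]
    if not job_ids:
        raise EmailSendFailedException(80003, "empty result set is not allowed")
    return job_ids


def generate_email_contents(job_ids, result_paths_of, result_type_of):
    """第二步:按jobId取结果集路径,仅图片类型结果集允许进入邮件正文。"""
    contents = []
    for job_id in job_ids:
        for path in result_paths_of(job_id):
            rs_type = result_type_of(path)
            if rs_type != PICTURE_TYPE:
                # 与Scala实现一致:html/table/text等类型直接报错
                raise EmailSendFailedException(80003, rs_type + " result set is not allowed")
            contents.append((rs_type, path))
    return contents
```

这样,上游Display/Dashboard预览产生的图片结果集路径,最终以(类型, 路径)列表的形式进入邮件正文;任何非图片类型都会像Scala实现一样以80003错误中止发送。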
\ No newline at end of file diff --git a/visualis_docs/zh_CN/Visualis_sql_databind_cn.md b/visualis_docs/zh_CN/Visualis_sql_databind_cn.md deleted file mode 100644 index 3b53e667d..000000000 --- a/visualis_docs/zh_CN/Visualis_sql_databind_cn.md +++ /dev/null @@ -1,55 +0,0 @@ -> 工作流-Widget节点绑定DSS结果集节点 - -## 1. 简介 - -    Visualis作为一个可视化报表系统,目前已与DSS工作流打通,可以通过拖拽的方式来新建Visualis节点,进行可视化开发。在传统的Visualis使用方式中,可视化组件Widget需要创建一个类似视图的组件View来提供图形渲染的数据源。对于Widget而言,只要结果集为结构化的数据集,都能作为Widget的数据源进行可视化图形开发。 - -## 2. 使用方式 -    如果需要在DSS中使用Visualis的节点,需要参考[Visualis AppConn安装部署文档](./Visualis_appconn_install_cn.md)。目前DSS的数据节点,只要能产生结构化数据结果集,都支持与Visualis的Widget节点绑定。Widget绑定DSS数据节点的说明可以参考下表: - -|节点名|任务类型|备注| -|-----|-----|-----| -|sql|Spark SQL任务|不支持多结果集| -|pyspark|Python Spark任务|见备注| -|hql|Hive SQL任务|不支持多结果集| - -    对于Sql节点和Hql节点,只要不是多结果集查询,在执行完成后,其产生的DataFrame结果集会注册到Linkis的CS服务中,并作为临时表存储。**在使用pyspark节点作为上游表时需要注意**,在使用Spark的Python来实现数据查询并作为Widget的数据源时,需要产生一个DataFrame结果集,并调用其show方法;Widget中会显示该DataFrame的维度信息,对表格而言,它是一个多维度(多列)的表格。 -```python -df = spark.sql("select * from default.demo") -df.show() -``` -    如下图为在DSS中将Visualis节点绑定上游节点的使用方式。 -![Widget绑定上游表](./../images/widget_databind_sql.gif) - -## 3. 实现原理 -    在DSS工作流中,拖拽数据开发节点后,其执行会产生一个类似于cs_tmp_sql_5643_rc1的临时表。当拖拽Widget节点绑定数据开发节点时,在DSS工作流Json中,会设置Widget节点的Json配置bindViewKey为上游绑定的数据开发节点NodeId,DSS后端会通过该绑定的NodeId找到其CS缓存表,并在请求同步创建Visualis的Widget节点时,传递其CS表对应的CS ID,作为Widget节点所使用的数据源。 -    下图是在DSS拖拽创建Widget节点时,绑定上游SQL节点所生成的工作流参数JSON。其中bindViewKey即为上游SQL节点的Node ID。 -```json -{ - "title": "widget_2919", - "bindViewKey": "22418bea-caec-4129-93ba-ce1938274b1c", - "desc": "" -} -``` -    其创建过程可以如下图所示: -![绑定数据节点](../images/sql_databind.png) - -## 3.1. 与DSS对接实现细节 -    DSS支持Widget、Display和Dashboard节点,它们的CRUD与执行是与Visualis对接的,与DSS对接的实现细节逻辑如下。 - -DSS侧需要实现Visualis的AppConn的相关逻辑为: -1. 实现规范中的ProjectCreationOperation,在DSS项目创建时调用。通过HTTP的方式调用Visualis的Controller中的Project创建接口。(根据DSS权限在Visualis原生首页展示Project列表的功能暂未实现) -2.
实现规范中Ref的CRUD相关的Operation接口,在DSS节点创建时调用,通过参数先判断具体需要创建的节点类型,再通过HTTP的方式调用具体的Visualis的Controller中的Widget、Display或Dashboard创建接口。 - * 其中Widget的创建,并非调用Controller的默认接口,而是在WidgetResultfulApi中专门定义了/widget/smartcreate接口,处理接入CS和虚拟view改造后需要的一些额外逻辑。另外,Widget本身会保存当前工作流的CSID作为查询上下文信息的依据,所以当CSID本身发生变化时,需要调用widget/setcontext接口,更新对应的Widget中记录的CSID,否则会发生找不到上游表的情况。 - * Dashboard在Visualis中实际为多层结构,表现为Dashboard Portal-Dashboard。节点对接时默认创建两层,即与节点同名的Dashboard Portal下,只存在一个与节点同名的Dashboard,所以在创建时,需要依次调用两个创建接口,并使它们名称相同,并且是相互关联的。 - * Display在Visualis中实际为两层结构,表现为Display和Display Slide,它们是一对一关系,所以在创建时,需要依次调用两个创建接口,并使它们是相互关联的。 - * 实现规范中的VisualisRefExecutionOperation接口,在DSS节点执行时,通过HTTP的方式调用Visualis中的相应接口获取结果。 - * Widget的执行,调用WidgetResultfulApi中的/visualis/widget/{id}/getdata接口来获取Widget的执行结果。该接口专门为对接DSS而实现,通过解析Widget的config栏位,模拟前端拼接的查询相关参数,调用后台查询接口,获取执行结果。Widget执行的结果集,就是它所提交的Spark SQL查询的结果集。 - * Display/Dashboard的查询,调用对应Controller的preview接口,获取对应的截图二进制文件,作为结果集的Record。而结果集的Metadata,需要再次通过专门的接口来获取,接口定义在WidgetResultfulApi中,接口格式为/widget/{type}/{id}/metadata,其中Display的type是display,Dashboard的type是portal。要注意这里的id,Dashboard要传对应的Dashboard Portal的id。Metadata接口返回的内容是json结构,记录了该Display/Dashboard上所添加的所有Widget的名称与其字段、更新时间的对应关系。 - * 实现规范中的Ref的导入导出的Operation接口,在DSS节点进行导入导出操作时,通过HTTP的方式调用Visualis中的相应接口。 - * ProjectRestfulApi中的import/export接口即为Visualis对导入导出功能的实现。导出接口接收Project ID和对应的widget、display或dashboard的ID作为参数,将所有的信息导出为json结构,上传到BML后,返回resourceId和version。导入接口接收Project ID和BML的resourceId和version作为参数,从BML上下载json结构后,复原成具体的实体,并返回新老ID的对应关系。 - * 需要注意,在AppConn侧,导入成功之后,需要更新原本的JobContent中的id信息,并返回给工作流进行更新。 - -Visualis侧需要进行以下相关改造: -1. 接入SSO规范。由于Visualis前端与DSS共享用户态,因此只要实现后端接口互相调用时的SSO即可。需要实现VisualisUserInterceptor,用来操作HTTP session中的用户信息。实现VisualisSSOFilterInitializer,用来将DSS提供的SSO Filter加入Visualis的HTTP请求处理的链路中。实现ModifyHttpRequestWrapper,用来将DSS请求提供的cookie信息复制到visualis侧的cookie中。 -2.
前端改造。为了支持多环境统一的前端访问,前端页面通过URL捕捉参数env={env},将参数转换为route label,放入以该页面为起点的后续所有接口请求中,使得gateway能够根据label将请求转发到对应的dev等环境对应的Visualis后台实例中。 diff --git a/visualis_docs/zh_CN/Visualis_user_manul_cn.md b/visualis_docs/zh_CN/Visualis_user_manul_cn.md deleted file mode 100644 index af2e5e02b..000000000 --- a/visualis_docs/zh_CN/Visualis_user_manul_cn.md +++ /dev/null @@ -1,70 +0,0 @@ -# Visualis使用文档 -## 服务入口 -Visualis服务目前以DataSphereStudio的一个模块的方式提供,可以进入DSS首页,再按以下步骤进入服务。 -支持两种方式使用Visualis服务: -1. 进入工作空间,通过[常用功能-进入工作流开发]进入工作流界面: -a) 个人测试使用,可以直接新建工作流。 -b) 正式使用,建议先新建合作项目,给予相关人员编辑、查看权限后,进入合作项目,再新建工作流。 -c) 进入工作流界面,拖拽Display、Dashboard和Widget节点,保存后,双击节点跳转到对应的编辑页面。 -2. 进入工作空间,通过[常用功能-进入Visualis]进入,使用习惯与DWS中保持一致。注意:从这个入口创建的项目和display、dashboard,无法被工作流引用,因此只支持编辑View和Widget,如果需要进行以邮件发送为目的的Display和Dashboard的编辑,或者有项目协作的需求,请使用第1种访问方式。 -## 功能概览 -### 项目/工程模块 -可以通过两种方式创建项目: -1. 在DSS的工作流模块中,创建一个新的项目,Visualis会同步新建一个同名项目。 -2. 通过Visualis首页的原生功能创建项目: -### 基本功能模块 -1. View视图 -2. Widget组件 -3. Viz可视化 -## 数据源 -1. Hive数据源,无需手动添加,系统已默认加载。 -2. 可以手动添加其它JDBC数据源。 -## 视图 -### 添加视图 -1. 从左侧菜单栏访问视图列表,点击右上角新增按钮: -2. 点击左上角选择一个source,选择相应的数据源(如果是hive数据源,选择hiveDataSource即可)。在编辑框内编写SQL后,点击右下角的Execute接口预执行。 -3. 执行完毕后,可在下方的结果集页面预览执行结果。 -4. 在结果集页面,将tab切换至Model后,可以调整字段类型信息。(如果希望使用中文字段名,可以使用select as语句将字段转换为中文) -5. 完成编辑后,可在左上角为该视图命名后,点击右上角的保存按钮保存该视图。 -## 图表组件 -### 建立图表 -1. 从左侧菜单栏进入Widget组件列表,选择右上方的新增按钮。 -2. 进入图表编辑界面。该界面从左到右分别为:视图字段栏、图表配置栏和图表展示区。在左上角选择一个视图后,可以看到视图中的所有字段均在左侧的视图字段栏中被列出。 -3. 将字段栏中的字段拖拽进配置栏的指标和维度里,即可实现对图表的编辑和预览。其中,分类型的字段可以被拖入维度中,数值型的字段可以被拖入指标中。如果发现字段的数值型和分类型的划分有误,也可以将字段直接上下拖拽到另一个种类型的栏中,从而直接改变字段的类型。 -4. 其中,点击指标后,可以在下拉菜单中切换该指标的聚合方式(默认为sum)。 -5. 在展示区预览数据后,可以在配置区切换图表的展现类型。将鼠标悬浮到缩略图标上时,能够看到该图表类型需要满足的指标维度的数量,按照提示调整后,即可完成图表的编辑。 -6. 图表展示支持两种驱动模式,a)透视驱动:该模式下,将为每个指标单独生成一张图表;b)图表驱动:该模式下,所有数据均在同一张图表中展示,但数据必须在图表限制的范围内。可以在下图所示区域切换这两种模式。 -7. 完成图表编辑后,在左上角输入图表组件名称,点击右上角保存按钮。 -### 筛选与排序 -1. 支持将任意字段拖拽进配置栏中的筛选框,进行对图表结果的进一步筛选。 -2. 拖拽后,界面将出现一个选择窗口,展现当前字段的所有值。勾选需要的值点击保存。 -3. 如果有更复杂的筛选要求,可以切换到条件筛选,定制个性化的筛选。 -4. 支持点击维度或指标,进行排序的选择: -5. 
注意,当对筛选排序选项进行配置后,必须点击右上角的保存按钮进行保存,否则退出后再次进入,将不会保留上次的配置。 -### 图表样式与调整 -1. 在一些图表类型中,支持针对每个维度的值使用不同的颜色进行区分显示。如下图所示,将字段拖入颜色框中,在弹出的窗口里,为每个值分配不同的颜色后保存。 -2. 可以通过样式栏,对字体、颜色、标签、坐标轴等元素的外观进行调整,每种图表的可选项均有不同。 -3. 样式的调整也需要通过右上角的保存按钮进行保存。 -## 可视化展示 -### 两种展示形式 -1. Visualis支持DashBoard和Display两种展示形式,可以在左侧菜单栏中选择可视化栏,进入选择界面。 -2. 其中,DashBoard的图表以更加有序统一的形式组织在屏幕上,并提供图表联动和全局筛选等高级功能。 -3. 而Display的编辑器拥有更高的自由度,支持背景颜色、图层顺序、自定义标签等常用的排版选项,方便定制出更加具有艺术个性的可视化大屏。 -### DashBoard -1. 点击添加DashBoard,输入名字后,点击保存,即可在列表中找到刚刚添加的DashBoard。 -2. 点击图标进入编辑界面。编辑界面中,支持创建多层的目录结构,以及往目录下添加子DashBoard,以便将DashBoard以不同的逻辑进行分类。 -3. 在子DashBoard中,点击右上角的添加图标,即可选择我们之前创建的Widget图表组件,添加到屏幕上来。 -4. 第二步中,支持对报表数据的刷新间隔进行配置,默认为手工刷新,可以调整为以秒为单位的自动刷新。 -5. 点击保存后,即可看到选中的图表已经被添加到屏幕上。此时,可以通过拖拽来调整图表的大小及位置。 -6. 在DashBoard编辑界面的操作都将自动保存,无需额外操作。 -### Display -1. 点击添加Display,输入名字后,点击保存,即可在列表中找到刚刚添加的Display。 -2. 点击图标进入编辑界面。在右边栏中,支持对整个Display进行一些基础的定制操作。 -3. 点击上方菜单栏的图表按钮,即可选择Widget组件进行添加。 -4. 点击上方菜单栏的小组件按钮,即可添加一些辅助性的小组件。例如文字标签、当前时间等。 -5. 通过拖拽,可以将图表和组件进行位置的调整和放大缩小等操作。如果感到画面太小,可以在画布的右下角调整显示的比例。 -6. 直接在画布中点击某个图表,或在右边图层中勾选某个图层(每张图表或小组件都构成一个单独的图层),即可对该图层进行独立的配置。 -7. 对于文字标签,可以在右边的配置栏中输入文字。 -### 分享与授权 -1. 支持将某个DashBoard、某个Display或某个Widget通过链接分享给第三方。注意:打开分享链接时,第三方必须已经登录DSS。 -2. 除了普通分享之外,Visualis也支持针对指定用户进行授权分享,只有被授权的用户登录后打开链接才能看到图表内容。 \ No newline at end of file diff --git a/visualis_docs/zh_CN/Visualis_visual_doc_cn.md b/visualis_docs/zh_CN/Visualis_visual_doc_cn.md deleted file mode 100644 index 8e3849f74..000000000 --- a/visualis_docs/zh_CN/Visualis_visual_doc_cn.md +++ /dev/null @@ -1,94 +0,0 @@ -> 虚拟View设计文档 - -该文档主要阐述Visualis系统的Widget页面在运行时接收一个或多个元数据信息(即json格式的拼凑View,包括字段名称、字段类型、原始查询语句、数据源信息)作为参数,并根据选择的参数View中提供的信息进行动态数据查询的架构调整方案。 - -调整涉及以下几个方面: -1. 参数View和Source结构。 -2. Widget页面改造。 -3. 查询逻辑改造。 -## 参数化View和Source结构 -1. 
新增以下概念: -a) 虚拟view:一个不含具体内容的view,绑定该view的widget即作为接收view参数进行动态渲染的widget。 -b) 参数view:以json的形式包含字段名称、字段类型、原始查询语句、数据源等信息,作为url参数传递给widget编辑页面。 -c) 参数source:作为参数view的一个栏位,同样为json结构,指定该view在进行查询时应当提交的引擎类型(spark、hive、jdbc等)、数据源类型(hive库表、SQL脚本、linkis结果集等)、数据源具体内容、数据来源(creator,如scriptis)。 -d) 以上概念中,widget需要能够被绑定一个虚拟view,从而实现对拼凑view(即参数view)的参数接收。 -2. 虚拟view为在数据库中插入的一行,它的project_id和source_id为-1; sql、model、variable和config栏位均为null。 -3. 仅支持在被其它系统调用并传递参数的情况下,新建绑定虚拟view的widget,无法在建立widget的时候手工选择绑定虚拟view。 -4. 参数view需要符合以下json格式。 -a) 数据源类型和数据具体内容的对应: - -|dataSourceType |dataSourceContent | -|---------------|------------------| -|resultset|结果集路径| -|script|BML resource id + version| -|table|库表名称| -|context|context id,keyword| -|url|url| -``` -{ - "name": "test_view1", - "model": { - "data_id": { - "sqlType": "STRING", - "visualType": "string", - "modelType": "value" - }, - "ds": { - "sqlType": "STRING", - "visualType": "string", - "modelType": "category" - } - }, - "source": { - "engineType": "spark", //引擎类型 - - "dataSourceType": "resultset", //数据源类型,结果集、脚本、库表 - "dataSourceContent": { - "resultLocation": "/tmp/linkis/resultset/_0.dolphin" - }, - "creator": "scriptis" - }, - "params": "[]" -} -``` -5. 传参方式 -a) 新建widget:/dss/visualis/#/project/3/widget/add?views=[{view1 json},{view2 json}] -b) 编辑widget:/dss/visualis/#/project/3/widget/4?views=[{view1 json},{view2 json}] -c) 考虑给出直接建立虚拟widget的post接口,然后打开widget就可以渲染 -## Widget页面改造 -1. 新增以下概念: -a) 虚拟widget:即为绑定虚拟view的widget,config中明确指明virtual=true。其config中增加source字段,存储参数source;其config的model字段,存储参数view中的字段信息。 -b) Context ID:作为widget节点被创建时,widget的config字段中增加contextId字段,存储该widget节点所在flow对应的context id。(所有widget节点对应的widget都是虚拟widget,创建的时候就设置virtual=true) -2. 打开新建widget界面时: -a) 如果无URL参数,则保持原有逻辑。 -b) 如果有URL参数,则认为是准备新建虚拟widget。如果参数只有一个,则选中这个参数view;如果有多个,则不选中,全部放进下拉列表。保存时,指明virtual=true。 -3. 打开编辑widget界面时: -a) 如果无URL参数,且该widget是虚拟widget: -i. 检查config中model非空,如果有,则查询直接提交到config中的参数source。 -ii.
如果config中model为空,contextId非空,则根据contextId找到上游的所有metadata(后端提供接口),作为下拉列表的备选view。选中context中的metadata并保存后,config中的source更新为对应的context类型,并记录相应的key。 -iii.如果什么都没有,保持原有的编辑空widget页面的逻辑不变,没有什么有意义的内容可供操作。 -b) 如果有URL参数,且该widget是虚拟widget:如果参数只有一个,则选中这个参数view,如果有多个,则不选中,全部放进下拉列表。 -c) 不管是否有URL参数,只要打开的是非虚拟的正常widget,都保持原有逻辑不变。 -4. getdata和share/data接口,在虚拟widget下,都要多传一个source参数。后端处理时: -a) 如果是虚拟widget,而且没有传source,会直接查询报错; -b) 如果是非虚拟的正常widget,就算传了source,也直接忽略。 -## 查询逻辑改造 -针对新增的各种数据源,之前的查询方式需要作出以下改造: -1. SourceInitializer:对数据源进行初始化,返回初始化后更加详细的source信息。 -a) 针对结果集数据源,在spark中生成或更新temp view,返回的source中补充select该temp view的sql。 -b) 针对SQL脚本数据源,从BML拉取对应的sql,放入返回的source中。 -c) 针对hive库表数据源,拼接select语句放入返回的source中。 -d) 针对CS数据源,根据context id拉取具体的元数据内容,拼接select语句放入返回的source中。 -e) 针对URL外部数据源,请求数据后,转换成dolphin格式,提交给spark建立temp view,返回的source中补充select该temp view的sql。 -i. URL数据源提供方直接给出dolphin格式的数据。 -ii. 可以考虑初步实现csv等常见格式到dolphin的转换。 -iii. 其它格式,后续按对接其它系统的需求再实现。 -2. QueryStatementGenerator:根据指标维度条件、数据源信息,生成相应的查询语句。 -a) 默认实现SQLQueryStatementGenerator,将指标维度转换成select语句的各个部分,from部分为source中包含的原始查询。 -b) 其它语言的后续按需求实现。 -3. QueryExecutor:将查询提交到相应的引擎,以及负责进度、状态和结果集的获取。 -a) 从外部系统跳转的source中,如果带有creator信息,则提交该creator的引擎;如果没有,默认提交给visualis引擎。 -b) 同步查询:直接查询,阻塞到结果集生成后,直接返回结果集。 -c) 异步查询:提交查询,返回查询id,提供进度、状态跟踪,最后通过id获取结果集。 -4. ResultParser:将引擎返回的原始结果集,转换为可以返回给前端渲染的结果集格式。 -a) DolphinToVisualisResultParser,转换dolphin格式的结果集到Visualis前端格式。 diff --git a/visualis_docs/zh_CN/visualis_design_cn.md b/visualis_docs/zh_CN/visualis_design_cn.md deleted file mode 100644 index 174798649..000000000 --- a/visualis_docs/zh_CN/visualis_design_cn.md +++ /dev/null @@ -1,56 +0,0 @@ -# 1. 
功能特性 -    基于达芬奇项目, Visualis与DataSphere Studio结合,一同实现了以下特性: -- 图表水印 -- 数据质量校验 -- 图表展示优化 -- 对接Linkis计算中间件 -- Scriptis结果集一键可视化 -- 外部应用参数支持 -- Dashboard/Display集成为DataSphere Studio的工作流节点 - -    Visualis同时支持以下Davinci的原生功能: -- **数据源** -  - 支持JDBC数据源 -  - 支持CSV文件上传 -- **数据视图** -  - 支持定义SQL模版 -  - 支持SQL高亮显示 -  - 支持SQL测试 -  - 支持回写操作 -- **可视组件** -  - 支持预定义图表 -  - 支持控制器组件 -  - 支持自由样式 -- **交互能力** -  - 支持可视组件全屏显示 -  - 支持可视组件本地控制器 -  - 支持可视组件间过滤联动 -  - 支持群控控制器可视组件 -  - 支持可视组件本地高级过滤器 -  - 支持大数据量展示分页和滑块 -- **集成能力** -  - 支持可视组件CSV下载 -  - 支持可视组件公共分享 -  - 支持可视组件授权分享 -  - 支持仪表板公共分享 -  - 支持仪表板授权分享 -# 2. 与DSS集成 -    Visualis是一个数据可视化平台解决方案,面向业务人员、数据工程师、数据分析师及数据相关岗位,提供一站式数据可视化解决方案。用户只需在可视化页面前端上,简单配置不同数据源,即可实现一套数据可视化应用,并支持多种数据模型展示,提供高级交互、行业分析、模式探索、社交智能等可视化功能。Visualis与DataSphere Studio的数据开发、工作流调度和数据质量校验等模块无缝衔接,实现数据应用开发全流程的连贯顺滑用户体验。 - - - -## 2.1. 应用商店集成 -    Visualis可以内嵌入DSS前端页面,接入到DSS的应用商店,可以免密互通。 -![](../images/visualis_dss_1.png) - - -## 2.2. 工作流集成 -    Visualis实现了DSS的二级和三级规范,接入DSS工程和编排(工作流),配置DSS的工作流节点,支持在DSS工作流中以拖拽的方式使用Visualis。 -![](../images/visualis_dss_2.png) - - - -# 3. 架构设计 -    Visualis围绕 View(数据视图)与 Widget(可视化组件)两个核心概念设计。View 是数据的结构化形态,一切逻辑/权限/服务等相关都是从 View 展开(在DSS工作流中,Spark SQL节点作为虚拟View)。Widget 是数据的可视化形态,一切展示/交互/引导等都是从 Widget 展开。下图是Visualis的功能组件模块。 -![](./../../images/architecture.png) - - - diff --git a/visualis_docs/zh_CN/visualis_update_cn.md b/visualis_docs/zh_CN/visualis_update_cn.md deleted file mode 100644 index 604d5bc45..000000000 --- a/visualis_docs/zh_CN/visualis_update_cn.md +++ /dev/null @@ -1,59 +0,0 @@ -Documentation for upgrading Visualis 1.0.0-rc1 to 1.0.0 - ---- - - - -## 1. The upgrade is mainly divided into the following steps: - -- Service stop -- Execute database upgrade script -- Replace the visualis deployment directory with a new version package -- Add and modify configuration files -- Service startup - -#### 1. Service stop - -Enter the deployment directory of Visualis, and execute the command under the directory to stop the services of Visualis: -```shell -cd ${VISUALIS_INSTALL_PATH} -sh bin/stop-visualis-server.sh -``` - -#### 2.
Execute database upgrade SQL script - -After connecting to the Visualis database, execute the following SQL: -```sql -alter table linkis_user rename to visualis_user; -``` - -#### 3. Replace the visualis deployment directory with a new version package - -- Back up the deployment directory of the old version of visualis. Take this directory as an example: -```shell -mv /appcom/Install/VisualisInstall/lib /appcom/Install/VisualisInstall/lib-bak -``` -- Refer to the [visualis installation and deployment document](./visualis_deploy_doc_cn.md). After compiling and packaging, replace lib. - - - -#### 4. Modify configuration - -- The cookie settings of Visualis 1.0.0-rc1 were kept for compatibility with dss1.0.1 and linkis1.1.1. You need to delete the following parameters and use the linkis_user_session_ticket_id_v1 value configured by default in the code. - -```properties -#Delete the following configuration -wds.linkis.session.ticket.key=bdp-user-ticket-id -wds.dss.visualis.ticketid=bdp-user-ticket-id - -``` -- After the configuration modification is completed, you need to reinstall the Visualis AppConn on the DSS side. To install the Visualis 1.0.0 AppConn, refer to [visualis appconn installation](./visualis_appconn_install_cn.md). - - - - -#### 5. Service startup -    Now you can start the new version of Visualis services. Execute the command to start the services: -```shell -sh bin/start-visualis-server.sh -``` \ No newline at end of file diff --git a/visualis_docs/zh_CN/visualis_use_doc_cn.md b/visualis_docs/zh_CN/visualis_use_doc_cn.md deleted file mode 100644 index 7fc04b25d..000000000 --- a/visualis_docs/zh_CN/visualis_use_doc_cn.md +++ /dev/null @@ -1,18 +0,0 @@ -> 使用文档 -## 1. 基本使用文档 -    Visualis是基于Davinci进行开发的数据可视化BI产品,支持原有的[Davinci用户用法](https://edp963.github.io/davinci/)。在此基础上,Visualis提供了更多额外的功能点,主要为:结果集可视化、工作流使用、邮件使用。 - -## 2.
结果集可视化 -    Visualis支持对接DSS的交互式脚本分析,在脚本运行完成后,可以对脚本的结果集进行可视化分析,并且结果集会自动绑定到一个默认的Widget,简单拖拽即可实现Widget的开发。 -![scriptis visualis](./../images/visualis_scriptis_visualis.gif) - -## 3. 工作流使用 -    Visualis对接了DSS工作流。在DSS侧创建工程时,会同步创建Visualis工程;在工作流中拖拽Visualis节点,同步也会在该工程中创建对应的组件。在工作流中使用Widget时,Widget需要绑定一个上游表作为数据源,来开发可视化图形,相关实现原理可以参考[Widget节点绑定DSS结果集节点](./Visualis_sql_databind_cn.md)。通过拖拽Widget、Display、Dashboard三个组件,连接成线,即可实现一个可视化报表。 -![visualis workflow](./../images/visualis_workflow.gif) -    目前Visualis 1.0.0对接DSS 1.1.0时新增了View节点,与Sql节点类似;但在与Widget节点联合使用时,选用非绑定方式,双击进入Widget后,在View选择栏中进行选择并保存。 - -## 4. 邮件使用 -    DSS提供数据输出节点。在部署安装DSS时,需要配置相关的邮件服务器配置;使用邮件前,需确保邮件服务器的可用性。在工作流中通过拖拽邮件节点,连线并依赖可视化节点,配置相关邮件选项,即可发送邮件。邮件发送的最终效果,可以通过Display和Dashboard的预览接口进行查看。 -![sendemail](./../images/dss_sendemail.gif) - -有些使用上的注意点,可以参考[Visualis接入DSS/Linkis注意点](./Visualis_dss_integration_cn.md)。 diff --git a/webapp/.gitignore b/webapp/.gitignore new file mode 100644 index 000000000..ad2028038 --- /dev/null +++ b/webapp/.gitignore @@ -0,0 +1,14 @@ +# Don't check auto-generated stuff into git +coverage +build +node_modules +stats.json + +# Cruft +.DS_Store +npm-debug.log +.idea +pom.xml +.vscode +.awcache +.history \ No newline at end of file diff --git a/webapp/app/app.tsx b/webapp/app/app.tsx index 703c35679..a1485499f 100644 --- a/webapp/app/app.tsx +++ b/webapp/app/app.tsx @@ -87,8 +87,6 @@ echarts.registerTheme('default', DEFAULT_ECHARTS_THEME) import configureStore from './store' import createRoutes from './routes' -import 'default-passive-events' - const initialState = {} const store = configureStore(initialState, hashHistory) const MOUNT_NODE = document.getElementById('app') diff --git a/webapp/app/assets/fonts/excel.svg b/webapp/app/assets/fonts/excel.svg deleted file mode 100644 index d02a5c2c1..000000000 --- a/webapp/app/assets/fonts/excel.svg +++ /dev/null @@ -1 +0,0 @@ - \ No newline at end of file diff --git a/webapp/app/assets/fonts/iconfont.css b/webapp/app/assets/fonts/iconfont.css
index f796b21e9..44ba3c1bc 100644 --- a/webapp/app/assets/fonts/iconfont.css +++ b/webapp/app/assets/fonts/iconfont.css @@ -75,16 +75,6 @@ content: "\e6c6"; } -.icon-relation-graph:before { - content: "\e7be"; -} - -.icon-relation-excel:before { - position: relative; - top: 3px; - content: url('excel.svg'); -} - .icon-widget-gallery:before { content: "\e671"; } diff --git a/webapp/app/assets/js/geo.js b/webapp/app/assets/js/geo.js old mode 100644 new mode 100755 diff --git a/webapp/app/assets/json/slideSettings/slide.json b/webapp/app/assets/json/slideSettings/slide.json index 273986da5..150e7f9b2 100644 --- a/webapp/app/assets/json/slideSettings/slide.json +++ b/webapp/app/assets/json/slideSettings/slide.json @@ -24,22 +24,6 @@ "component": "colorPicker", "default": [255,255,255,50] }] - }, { - "name": "display", - "title": "展示", - "items": [{ - "name": "displayMode", - "title": "展示模式", - "component": "radio", - "values": [{ - "name": "交互模式", - "value": "dynamic" - }, { - "name": "静态模式", - "value": "static" - }], - "default": "static" - }] }, { "name": "scale", "title": "缩放", diff --git a/webapp/app/assets/override/antd.css b/webapp/app/assets/override/antd.css index 9fe32f909..79033b9ee 100644 --- a/webapp/app/assets/override/antd.css +++ b/webapp/app/assets/override/antd.css @@ -176,10 +176,6 @@ margin: 0 4px; } -.ant-table-thead > tr > th { - padding: 10px 15px !important; -} - /* 缩小默认行高(override variable) */ /* .ant-table-thead > tr > th, .ant-table-tbody > tr > td { diff --git a/webapp/app/components/PaginationWithoutTotal/PaginationWithoutTotal.less b/webapp/app/components/PaginationWithoutTotal/PaginationWithoutTotal.less index 9cf417955..bc3df80ea 100644 --- a/webapp/app/components/PaginationWithoutTotal/PaginationWithoutTotal.less +++ b/webapp/app/components/PaginationWithoutTotal/PaginationWithoutTotal.less @@ -2,7 +2,7 @@ margin: 16px 0; display: flex; flex-direction: row; - justify-content: flex-start; + justify-content: flex-end; button { margin-left: 
8px; diff --git a/webapp/app/components/PaginationWithoutTotal/index.tsx b/webapp/app/components/PaginationWithoutTotal/index.tsx index 0770be0ae..0770446b3 100644 --- a/webapp/app/components/PaginationWithoutTotal/index.tsx +++ b/webapp/app/components/PaginationWithoutTotal/index.tsx @@ -71,9 +71,11 @@ export class PaginationWithoutTotal extends PureComponent { diff --git a/webapp/app/components/SharePanel/ShareForm.tsx b/webapp/app/components/SharePanel/ShareForm.tsx index 58c287336..e573811d4 100644 --- a/webapp/app/components/SharePanel/ShareForm.tsx +++ b/webapp/app/components/SharePanel/ShareForm.tsx @@ -1,6 +1,6 @@ import React, { createRef } from 'react' -import { Input, Row, Col, message} from 'antd' +import { Input, Row, Col} from 'antd' import config, { env } from 'app/globalConfig' // FIXME const apiHost = `${location.origin}${config[env].host}` @@ -20,7 +20,6 @@ export class ShareForm extends React.PureComponent { private handleInputSelect = () => { this.shareLinkInput.current.input.select() document.execCommand('copy') - message.success('复制成功') } public render () { diff --git a/webapp/app/containers/Dashboard/Grid.tsx b/webapp/app/containers/Dashboard/Grid.tsx index 1dcae1392..bab9ad530 100644 --- a/webapp/app/containers/Dashboard/Grid.tsx +++ b/webapp/app/containers/Dashboard/Grid.tsx @@ -92,7 +92,7 @@ import { makeSelectCurrentLinkages } from './selectors' import { ViewActions, ViewActionType } from 'containers/View/actions' -const { loadViewDataFromVizItem, loadViewExecuteQuery, loadViewGetProgress, loadViewGetResult, loadViewKillExecute, loadViewsDetail, loadSelectOptions } = ViewActions +const { loadViewDataFromVizItem, loadViewsDetail, loadSelectOptions } = ViewActions import { makeSelectWidgets } from 'containers/Widget/selectors' import { makeSelectViews, makeSelectFormedViews } from 'containers/View/selectors' import { makeSelectCurrentProject } from 'containers/Projects/selectors' @@ -224,35 +224,6 @@ interface IGridProps { requestParams: 
IDataRequestParams, statistic: any ) => void - onViewExecuteQuery: ( - renderType: RenderType, - dashboardItemId: number, - viewId: number, - requestParams: IDataRequestParams, - statistic: any, - resolve: (data) => void, - reject: (data) => void - ) => void - onViewGetProgress: ( - execId: string, - resolve: (data) => void, - reject: (data) => void - ) => void - onViewGetResult: ( - execId: string, - renderType: RenderType, - dashboardItemId: number, - viewId: number, - requestParams: IDataRequestParams, - statistic: any, - resolve: (data) => void, - reject: (data) => void - ) => void - onViewKillExecute: ( - execId: string, - resolve: (data) => void, - reject: (data) => void - ) => void onLoadViewsDetail: (viewIds: number[], resolve: () => void) => void onInitiateDownloadTask: (id: number, type: DownloadTypes, downloadParams?: IDataDownloadParams[], itemId?: number) => void onClearCurrentDashboard: () => any @@ -291,7 +262,6 @@ interface IGridStates { dashboardSharePanelAuthorized: boolean nextMenuTitle: string drillPathSettingVisible: boolean - executeQueryFailed: boolean } interface IDashboardItemForm extends AntdFormType { @@ -336,8 +306,7 @@ export class Grid extends React.Component { dashboardSharePanelAuthorized: false, - nextMenuTitle: '', - executeQueryFailed: false + nextMenuTitle: '' } } @@ -478,18 +447,7 @@ export class Grid extends React.Component { } } - private execIds = [] - - private deleteExecId = (execId) => { - const index = this.execIds.indexOf(execId); - if (index > -1) this.execIds.splice(index, 1) - } - public componentWillUnmount () { - this.timeout.forEach(item => clearTimeout(item)) - this.execIds.forEach((execId) => { - this.props.onViewKillExecute(execId, () => {}, () => {}) - }) statistic.setDurations({ end_time: statistic.getCurrentDateTime() }, (data) => { @@ -534,180 +492,15 @@ export class Grid extends React.Component { private calcItemTop = (y: number) => Math.round((GRID_ROW_HEIGHT + GRID_ITEM_MARGIN) * y) private getChartData = 
(renderType: RenderType, itemId: number, widgetId: number, queryConditions?: Partial) => { - const { - currentItemsInfo, - widgets, - onViewExecuteQuery - } = this.props - const widget = widgets.find((w) => w.id === widgetId) - const widgetConfig: IWidgetConfig = JSON.parse(widget.config) - const { cols, rows, metrics, secondaryMetrics, filters, color, label, size, xAxis, tip, orders, cache, expired, view, engine } = widgetConfig - const updatedCols = cols.map((col) => widgetDimensionMigrationRecorder(col)) - const updatedRows = rows.map((row) => widgetDimensionMigrationRecorder(row)) - const customOrders = updatedCols.concat(updatedRows) - .filter(({ sort }) => sort && sort.sortType === FieldSortTypes.Custom) - .map(({ name, sort }) => ({ name, list: sort[FieldSortTypes.Custom].sortList })) - - const cachedQueryConditions = currentItemsInfo[itemId].queryConditions - - let tempFilters - let linkageFilters - let globalFilters - let tempOrders - let variables - let linkageVariables - let globalVariables - let drillStatus - let pagination - let nativeQuery - - if (queryConditions) { - tempFilters = queryConditions.tempFilters !== void 0 ? queryConditions.tempFilters : cachedQueryConditions.tempFilters - linkageFilters = queryConditions.linkageFilters !== void 0 ? queryConditions.linkageFilters : cachedQueryConditions.linkageFilters - globalFilters = queryConditions.globalFilters !== void 0 ? queryConditions.globalFilters : cachedQueryConditions.globalFilters - tempOrders = queryConditions.orders !== void 0 ? 
queryConditions.orders : cachedQueryConditions.orders - variables = queryConditions.variables || cachedQueryConditions.variables - linkageVariables = queryConditions.linkageVariables || cachedQueryConditions.linkageVariables - globalVariables = queryConditions.globalVariables || cachedQueryConditions.globalVariables - drillStatus = queryConditions.drillStatus || void 0 - pagination = queryConditions.pagination || cachedQueryConditions.pagination - nativeQuery = queryConditions.nativeQuery !== void 0 ? queryConditions.nativeQuery : cachedQueryConditions.nativeQuery - } else { - tempFilters = cachedQueryConditions.tempFilters - linkageFilters = cachedQueryConditions.linkageFilters - globalFilters = cachedQueryConditions.globalFilters - tempOrders = cachedQueryConditions.orders - variables = cachedQueryConditions.variables - linkageVariables = cachedQueryConditions.linkageVariables - globalVariables = cachedQueryConditions.globalVariables - pagination = cachedQueryConditions.pagination - nativeQuery = cachedQueryConditions.nativeQuery - } - - let groups = cols.concat(rows).filter((g) => g.name !== '指标名称').map((g) => g.name) - let aggregators = metrics.map((m) => ({ - column: decodeMetricName(m.name), - func: m.agg - })) - - if (secondaryMetrics && secondaryMetrics.length) { - aggregators = aggregators.concat(secondaryMetrics.map((second) => ({ - column: decodeMetricName(second.name), - func: second.agg - }))) - } - - if (color) { - groups = groups.concat(color.items.map((c) => c.name)) - } - if (label) { - groups = groups.concat(label.items - .filter((l) => l.type === 'category') - .map((l) => l.name)) - aggregators = aggregators.concat(label.items - .filter((l) => l.type === 'value') - .map((l) => ({ - column: decodeMetricName(l.name), - func: l.agg - }))) - } - if (size) { - aggregators = aggregators.concat(size.items - .map((s) => ({ - column: decodeMetricName(s.name), - func: s.agg - }))) - } - if (xAxis) { - aggregators = aggregators.concat(xAxis.items - .map((x) 
=> ({ - column: decodeMetricName(x.name), - func: x.agg - }))) - } - if (tip) { - aggregators = aggregators.concat(tip.items - .map((t) => ({ - column: decodeMetricName(t.name), - func: t.agg - }))) - } - - const requestParamsFilters = filters.reduce((a, b) => { - return a.concat(b.config.sqlModel) - }, []) - const requestParams = { - groups: drillStatus && drillStatus.groups ? drillStatus.groups : groups, - aggregators, - filters: drillStatus && drillStatus.filter ? drillStatus.filter.sqls : requestParamsFilters, - tempFilters, - linkageFilters, - globalFilters, - variables, - linkageVariables, - globalVariables, - orders, - cache, - expired, - flush: renderType === 'flush', - pagination, - nativeQuery, - customOrders - } - - if (typeof view === 'object' && Object.keys(view).length > 0) requestParams.view = view - - if (engine) requestParams.engineType = engine - - if (tempOrders) { - requestParams.orders = requestParams.orders.concat(tempOrders) - } - this.setState({executeQueryFailed: false}) - - onViewExecuteQuery(renderType, itemId, widget.viewId, requestParams, {...requestParams, widget}, (result) => { - const { execId } = result - this.execIds.push(execId) - this.executeQuery(execId, renderType, itemId, widget.viewId, requestParams, {...requestParams, widget}, this) - }, () => { - this.setState({executeQueryFailed: true}) - return message.error('查询失败!') - }) - } - - private timeout = [] - - private executeQuery(execId, renderType, itemId, viewId, requestParams, statistic, that) { - const { onViewGetProgress, onViewGetResult } = that.props - // 空数据的话,会不请求数据,execId为undefined,这时候不需要getProgress - if (execId) { - onViewGetProgress(execId, (result) => { - const { progress, status } = result - if (status === 'Failed') { - // 提示 查询失败(显示表格头,就和现在的暂无数据保持一致的交互,只是提示换成“查询失败”) - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - } else if (status === 'Succeed' && progress === 1) { - // 查询成功,调用 
结果集接口,status为success时,progress一定为1 - onViewGetResult(execId, renderType, itemId, viewId, requestParams, statistic, (result) => { - that.deleteExecId(execId) - }, () => { - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - }) - } else { - // 说明还在运行中 - // 三秒后再请求一次进度查询接口 - const t = setTimeout(that.executeQuery, 3000, execId, renderType, itemId, viewId, requestParams, statistic, that) - that.timeout.push(t) - } - }, (err) => { - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - }) - } + this.getData( + (renderType, itemId, widget, requestParams) => { + this.props.onLoadDataFromItem(renderType, itemId, widget.viewId, requestParams, {...requestParams, widget}) + }, + renderType, + itemId, + widgetId, + queryConditions + ) } // private downloadCsv = (itemId: number, widgetId: number) => { @@ -1106,7 +899,6 @@ export class Grid extends React.Component { }) const selectedWidgetsViewIds = widgets.filter((w) => selectedWidgets.includes(w.id)).map((w) => w.viewId) const viewIds = selectedWidgetsViewIds - .filter((viewId) => typeof viewId === 'number' && viewId > 0) .filter((viewId, idx) => selectedWidgetsViewIds.indexOf(viewId) === idx) .filter((viewId) => !formedViews[viewId]) @@ -1726,8 +1518,7 @@ export class Grid extends React.Component { globalFilterConfigVisible, allowFullScreen, dashboardSharePanelAuthorized, - drillPathSettingVisible, - executeQueryFailed + drillPathSettingVisible } = this.state let dashboardType: number if (currentDashboard) { @@ -1798,8 +1589,7 @@ export class Grid extends React.Component { const drillHistory = queryConditions.drillHistory const drillpathSetting = queryConditions.drillpathSetting const drillpathInstance = queryConditions.drillpathInstance - const config = widget ? JSON.parse(widget.config) : {} - const view = config.view ? 
config.view : formedViews[widget.viewId] + const view = formedViews[widget.viewId] const isTrigger = currentLinkages && currentLinkages.length ? currentLinkages.map((linkage) => linkage.trigger[0] ).some((tr) => tr === String(id)) : false @@ -1849,7 +1639,6 @@ export class Grid extends React.Component { monitoredSyncDataAction={this.props.onMonitoredSyncDataAction} monitoredSearchDataAction={this.props.onMonitoredSearchDataAction} ref={(f) => this[`dashboardItem${id}`] = f} - executeQueryFailed={executeQueryFailed} /> )) @@ -2080,12 +1869,8 @@ export function mapDispatchToProps (dispatch) { onEditDashboardItem: (portalId, item, resolve) => dispatch(editDashboardItem(portalId, item, resolve)), onEditDashboardItems: (portalId, items) => dispatch(editDashboardItems(portalId, items)), onDeleteDashboardItem: (id, resolve) => dispatch(deleteDashboardItem(id, resolve)), - onLoadDataFromItem: (renderType, itemId, viewId, requestParams, statistic) => dispatch(loadViewDataFromVizItem(renderType, itemId, viewId, requestParams, 'dashboard', statistic)), - onViewExecuteQuery: (renderType, itemId, viewId, requestParams, statistic, resolve, reject) => dispatch(loadViewExecuteQuery(renderType, itemId, viewId, requestParams, 'dashboard', statistic, resolve, reject)), - onViewGetProgress: (execId, resolve, reject) => dispatch(loadViewGetProgress(execId, resolve, reject)), - onViewGetResult: (execId, renderType, itemId, viewId, requestParams, statistic, resolve, reject) => dispatch(loadViewGetResult(execId, renderType, itemId, viewId, requestParams, 'dashboard', statistic, resolve, reject)), - onViewKillExecute: (execId, resolve, reject) => dispatch(loadViewKillExecute(execId, resolve, reject)), - + onLoadDataFromItem: (renderType, itemId, viewId, requestParams, statistic) => + dispatch(loadViewDataFromVizItem(renderType, itemId, viewId, requestParams, 'dashboard', statistic)), onLoadViewsDetail: (viewIds, resolve) => dispatch(loadViewsDetail(viewIds, resolve)), 
onClearCurrentDashboard: () => dispatch(clearCurrentDashboard()), onInitiateDownloadTask: (id, type, downloadParams?, itemId?) => dispatch(initiateDownloadTask(id, type, downloadParams, itemId)), diff --git a/webapp/app/containers/Dashboard/components/DashboardItem.tsx b/webapp/app/containers/Dashboard/components/DashboardItem.tsx index 72f4b4fd7..193eb12ea 100644 --- a/webapp/app/containers/Dashboard/components/DashboardItem.tsx +++ b/webapp/app/containers/Dashboard/components/DashboardItem.tsx @@ -89,7 +89,6 @@ interface IDashboardItemProps { onGetControlOptions: OnGetControlOptions monitoredSyncDataAction?: () => any monitoredSearchDataAction?: () => any - executeQueryFailed?: boolean } interface IDashboardItemStates { @@ -142,7 +141,7 @@ export class DashboardItem extends React.PureComponent ) - // excel类型接的visualis的data - const visualisData = { - viewId: this.props.widget.viewId, - requestParams: widgetProps.query - } return (
this.container = f}> @@ -988,9 +981,9 @@ export class DashboardItem extends React.PureComponent
- {/* + {!loading && } - */} + {widgetButton} @@ -1052,9 +1045,7 @@ export class DashboardItem extends React.PureComponent {dataDrillHistory}
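The Grid and DashboardItem hunks above remove an asynchronous execute-query flow: submit the query, poll a progress endpoint every 3 seconds with `setTimeout`, fetch the result set once the status is `Succeed`, and kill any in-flight executions on unmount. A minimal, framework-free sketch of that poll-until-complete pattern follows; the names (`Progress`, `pollUntilComplete`) are hypothetical, not Visualis APIs:

```typescript
// Sketch of the poll-until-complete loop implemented by the removed
// executeQuery/onViewGetProgress code. Hypothetical names throughout.
type Progress = { status: 'Running' | 'Succeed' | 'Failed'; progress: number }

async function pollUntilComplete(
  getProgress: () => Promise<Progress>,
  intervalMs = 3000, // mirrors the 3-second setTimeout in the removed code
  maxPolls = 100     // retry budget so an abandoned job cannot poll forever
): Promise<Progress> {
  for (let i = 0; i < maxPolls; i++) {
    const p = await getProgress()
    if (p.status === 'Failed') throw new Error('query failed')
    // per the removed comment, progress is always 1 once status is Succeed
    if (p.status === 'Succeed' && p.progress === 1) return p
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
  throw new Error('query timed out')
}
```

The removed component code additionally recorded every `execId` so `componentWillUnmount` could kill outstanding executions; that bookkeeping is omitted from this sketch.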
diff --git a/webapp/app/containers/Dashboard/components/fullScreenPanel/FullScreenPanel.tsx b/webapp/app/containers/Dashboard/components/fullScreenPanel/FullScreenPanel.tsx index aa30ac5e0..6ace672a8 100644 --- a/webapp/app/containers/Dashboard/components/fullScreenPanel/FullScreenPanel.tsx +++ b/webapp/app/containers/Dashboard/components/fullScreenPanel/FullScreenPanel.tsx @@ -137,11 +137,6 @@ class FullScreenPanel extends React.PureComponent ) } diff --git a/webapp/app/containers/Dashboard/index.tsx b/webapp/app/containers/Dashboard/index.tsx index 165c18695..158e978c9 100644 --- a/webapp/app/containers/Dashboard/index.tsx +++ b/webapp/app/containers/Dashboard/index.tsx @@ -788,7 +788,6 @@ export class Dashboard extends React.Component - {/* dashboard编辑页下方左侧 */}
@@ -857,7 +856,6 @@ export class Dashboard extends React.ComponentLoading tree...... : '' }
- {/* dashboard编辑页下方右侧 */}
{ isGrid diff --git a/webapp/app/containers/Dashboard/reducer.ts b/webapp/app/containers/Dashboard/reducer.ts index 0817a2046..561f41e48 100644 --- a/webapp/app/containers/Dashboard/reducer.ts +++ b/webapp/app/containers/Dashboard/reducer.ts @@ -297,16 +297,7 @@ function dashboardReducer (state = initialState, action: ViewActionType | any) { errorMessage: '' } }) - case ViewActionTypes.VIEW_EXECUTE_QUERY: - return payload.vizType !== 'dashboard' ? state : state - .set('currentItemsInfo', { - ...itemsInfo, - [payload.itemId]: { - ...itemsInfo[payload.itemId], - loading: true, - errorMessage: '' - } - }) + case ViewActionTypes.LOAD_VIEW_DATA_FROM_VIZ_ITEM_SUCCESS: fieldGroupedSort(payload.result.resultList, payload.requestParams.customOrders) return payload.vizType !== 'dashboard' ? state : state.set('currentItemsInfo', { @@ -330,29 +321,6 @@ function dashboardReducer (state = initialState, action: ViewActionType | any) { } } }) - case ViewActionTypes.VIEW_GET_RESULT_SUCCESS: - fieldGroupedSort(payload.result.resultList, payload.requestParams.customOrders) - return payload.vizType !== 'dashboard' ? 
state : state.set('currentItemsInfo', { - ...itemsInfo, - [payload.itemId]: { - ...itemsInfo[payload.itemId], - loading: false, - datasource: payload.result, - selectedItems: [], - renderType: payload.renderType, - queryConditions: { - ...itemsInfo[payload.itemId].queryConditions, - tempFilters: payload.requestParams.tempFilters, - linkageFilters: payload.requestParams.linkageFilters, - globalFilters: payload.requestParams.globalFilters, - variables: payload.requestParams.variables, - linkageVariables: payload.requestParams.linkageVariables, - globalVariables: payload.requestParams.globalVariables, - pagination: payload.requestParams.pagination, - nativeQuery: payload.requestParams.nativeQuery - } - } - }) case GLOBAL_CONTROL_CHANGE: const controlRequestParamsByItem: IMapItemControlRequestParams = payload.controlRequestParamsByItem Object.entries(controlRequestParamsByItem) @@ -427,19 +395,6 @@ function dashboardReducer (state = initialState, action: ViewActionType | any) { }) : state : state - case ViewActionTypes.VIEW_GET_RESULT_FAILURE: - return payload.vizType === 'dashboard' - ? !!itemsInfo - ? 
state.set('currentItemsInfo', { - ...itemsInfo, - [payload.itemId]: { - ...itemsInfo[payload.itemId], - loading: false, - errorMessage: payload.errorMessage - } - }) - : state - : state case LOAD_DASHBOARD_SHARE_LINK: return state.set('currentDashboardShareInfoLoading', true) diff --git a/webapp/app/containers/Display/Editor.tsx b/webapp/app/containers/Display/Editor.tsx index e543ff7c6..bb82471ea 100644 --- a/webapp/app/containers/Display/Editor.tsx +++ b/webapp/app/containers/Display/Editor.tsx @@ -75,7 +75,7 @@ const styles = require('./Display.less') import { IWidgetConfig, RenderType } from '../Widget/components/Widget' import { decodeMetricName } from '../Widget/components/util' import { ViewActions } from '../View/actions' -const { loadViewDataFromVizItem, loadViewExecuteQuery, loadViewGetProgress, loadViewGetResult, loadViewKillExecute, loadViewsDetail } = ViewActions // @TODO global filter in Display +const { loadViewDataFromVizItem, loadViewsDetail } = ViewActions // @TODO global filter in Display import { makeSelectWidgets } from '../Widget/selectors' import { makeSelectFormedViews } from '../View/selectors' import { GRID_ITEM_MARGIN, DEFAULT_BASELINE_COLOR, DEFAULT_SPLITER, MAX_LAYER_COUNT } from 'app/globalConstants' @@ -148,38 +148,8 @@ interface IEditorProps extends RouteComponentProps<{}, IParams> { renderType: RenderType, layerItemId: number, viewId: number, - requestParams: IDataRequestParams, - statistic: any + requestParams: IDataRequestParams ) => void - onViewExecuteQuery: ( - renderType: RenderType, - dashboardItemId: number, - viewId: number, - requestParams: IDataRequestParams, - statistic: any, - resolve: (data) => void, - reject: (data) => void - ) => void - onViewGetProgress: ( - execId: string, - resolve: (data) => void, - reject: (data) => void - ) => void - onViewGetResult: ( - execId: string, - renderType: RenderType, - dashboardItemId: number, - viewId: number, - requestParams: IDataRequestParams, - statistic: any, - resolve: (data) 
=> void, - reject: (data) => void - ) => void - onViewKillExecute: ( - execId: string, - resolve: (data) => void, - reject: (data) => void - ) => void onLoadViewsDetail: (viewIds: number[], resolve: () => void) => void onShowEditorBaselines: (baselines: IBaseline[]) => void @@ -198,8 +168,7 @@ interface IEditorStates { id: number setting: any param: ILayerParams | Partial - }, - executeQueryFailed: boolean + } } export class Editor extends React.Component { @@ -212,15 +181,14 @@ export class Editor extends React.Component { slideParams: {}, currentLocalLayers: [], zoomRatio: 1, - sliderValue: 100, + sliderValue: 20, scale: 1, settingInfo: { key: '', id: 0, setting: null, param: null - }, - executeQueryFailed: false + } } } @@ -232,26 +200,12 @@ export class Editor extends React.Component { onHideNavigator() } - private execIds = [] - - private deleteExecId = (execId) => { - const index = this.execIds.indexOf(execId); - if (index > -1) this.execIds.splice(index, 1) - } - public componentWillUnmount () { - this.timeout.forEach(item => clearTimeout(item)) - this.execIds.forEach((execId) => { - this.props.onViewKillExecute(execId, () => {}, () => {}) - }) this.props.onResetDisplayState() } public componentWillReceiveProps (nextProps: IEditorProps) { const { currentSlide, currentLayers } = nextProps - if (this.state.slideParams && this.state.slideParams.displayMode) { - this.setDisplayMode(this.state.slideParams.displayMode) - } let { slideParams, currentLocalLayers } = this.state if (currentSlide && currentSlide !== this.props.currentSlide) { @@ -330,13 +284,12 @@ export class Editor extends React.Component { const { currentLayersInfo, widgets, - // onLoadViewDataFromVizItem - onViewExecuteQuery + onLoadViewDataFromVizItem } = this.props const widget = widgets.find((w) => w.id === widgetId) const widgetConfig: IWidgetConfig = JSON.parse(widget.config) - const { cols, rows, metrics, secondaryMetrics, filters, color, label, size, xAxis, tip, orders, cache, expired, view, 
engine } = widgetConfig + const { cols, rows, metrics, secondaryMetrics, filters, color, label, size, xAxis, tip, orders, cache, expired } = widgetConfig const updatedCols = cols.map((col) => widgetDimensionMigrationRecorder(col)) const updatedRows = rows.map((row) => widgetDimensionMigrationRecorder(row)) const customOrders = updatedCols.concat(updatedRows) @@ -354,6 +307,7 @@ export class Editor extends React.Component { let globalVariables let pagination let nativeQuery + if (queryConditions) { tempFilters = queryConditions.tempFilters !== void 0 ? queryConditions.tempFilters : cachedQueryConditions.tempFilters linkageFilters = queryConditions.linkageFilters !== void 0 ? queryConditions.linkageFilters : cachedQueryConditions.linkageFilters @@ -431,7 +385,7 @@ export class Editor extends React.Component { }, []) const requestParams = { - groups: Array.from(new Set(groups)), + groups, aggregators, filters: requestParamsFilters, tempFilters, @@ -448,68 +402,17 @@ export class Editor extends React.Component { nativeQuery, customOrders } - if (typeof view === 'object' && Object.keys(view).length > 0) requestParams.view = view - - if (engine) requestParams.engineType = engine if (tempOrders) { requestParams.orders = requestParams.orders.concat(tempOrders) } - // onLoadViewDataFromVizItem( - // renderType, - // itemId, - // widget.viewId, - // requestParams - // ) - - // 本身这里源代码就少一个statistic参数,可能没啥用,先随便赋个{} - const statistic = {} - this.setState({executeQueryFailed: false}) - onViewExecuteQuery(renderType, itemId, widget.viewId, requestParams, statistic, (result) => { - const { execId } = result - this.execIds.push(execId) - this.executeQuery(execId, renderType, itemId, widget.viewId, requestParams, statistic, this) - }, () => { - this.setState({executeQueryFailed: true}) - return message.error('查询失败!') - }) - } - - private timeout = [] - - private executeQuery(execId, renderType, itemId, viewId, requestParams, statistic, that) { - const { onViewGetProgress, 
onViewGetResult } = that.props - // 空数据的话,会不请求数据,execId为undefined,这时候不需要getProgress - if (execId) { - onViewGetProgress(execId, (result) => { - const { progress, status } = result - if (status === 'Failed') { - // 提示 查询失败(显示表格头,就和现在的暂无数据保持一致的交互,只是提示换成“查询失败”) - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - } else if (status === 'Succeed' && progress === 1) { - // 查询成功,调用 结果集接口,status为success时,progress一定为1 - onViewGetResult(execId, renderType, itemId, viewId, requestParams, statistic, (result) => { - that.deleteExecId(execId) - }, () => { - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - }) - } else { - // 说明还在运行中 - // 三秒后再请求一次进度查询接口 - const t = setTimeout(that.executeQuery, 3000, execId, renderType, itemId, viewId, requestParams, statistic, that) - that.timeout.push(t) - } - }, () => { - that.setState({executeQueryFailed: true}) - that.deleteExecId(execId) - return message.error('查询失败!') - }) - } + onLoadViewDataFromVizItem( + renderType, + itemId, + widget.viewId, + requestParams + ) } private updateCurrentLocalLayers = ( @@ -603,54 +506,6 @@ export class Editor extends React.Component { this.props.toggleLayersResizingStatus(editLayers.map((l) => l.id), false) } - // 设置展示模式为静态模式或者是动态模式 - private setDisplayMode = (value) => { - const widgetDOMs = document.getElementsByClassName('widget-class') - const paginationDOMs = document.getElementsByClassName('ant-pagination') - const tableHeaderDOMs = document.getElementsByClassName('ant-table-header') - const tableBodyDOMs = document.getElementsByClassName('ant-table-body') - const tableWrapperDOMs = document.getElementsByClassName('ant-table-wrapper') - if (value === 'static') { - // 静态模式,隐藏掉所有滚动条和分页组件 - for (let i = 0; i < widgetDOMs.length; i++) { - widgetDOMs[i].style.overflow = 'hidden' - } - for (let i = 0; i < tableHeaderDOMs.length; i++) { - tableHeaderDOMs[i].style.setProperty('overflow', 'hidden', 
'important') - } - for (let i = 0; i < tableBodyDOMs.length; i++) { - tableBodyDOMs[i].style.overflow = 'hidden' - } - for (let i = 0; i < tableWrapperDOMs.length; i++) { - tableWrapperDOMs[i].style.overflow = 'hidden' - } - for (let i = 0; i < paginationDOMs.length; i++) { - paginationDOMs[i].style.display = 'none' - } - } else { - // 动态模式 恢复原值 - for (let i = 0; i < widgetDOMs.length; i++) { - widgetDOMs[i].style.overflow = 'auto hidden' - } - for (let i = 0; i < tableHeaderDOMs.length; i++) { - tableHeaderDOMs[i].style.overflow = '' - tableHeaderDOMs[i].style.overflowX = 'hidden !important' - tableHeaderDOMs[i].style.overflowY = 'scroll !important' - } - for (let i = 0; i < tableBodyDOMs.length; i++) { - tableBodyDOMs[i].style.overflow = '' - tableBodyDOMs[i].style.overflowY = 'auto' - } - for (let i = 0; i < tableWrapperDOMs.length; i++) { - tableWrapperDOMs[i].style.overflowY = 'scroll' - tableWrapperDOMs[i].style.overflow = '' - } - for (let i = 0; i < paginationDOMs.length; i++) { - paginationDOMs[i].style.display = '' - } - } - } - private formItemChange = (field, val) => { const { slideParams, currentLocalLayers } = this.state @@ -742,8 +597,7 @@ export class Editor extends React.Component { }) }) if (viewIds && viewIds.length) { - const loadViewIds = viewIds.filter((viewId) => typeof viewId === 'number' && viewId > 0 && !formedViews[viewId]) - + const loadViewIds = viewIds.filter((viewId) => !formedViews[viewId]) if (loadViewIds.length) { onLoadViewsDetail(loadViewIds, () => { onAddDisplayLayers(currentDisplay.id, currentSlide.id, layers) @@ -953,23 +807,14 @@ export class Editor extends React.Component { zoomRatio, sliderValue, scale, - settingInfo, - executeQueryFailed + settingInfo } = this.state if (!currentDisplay) { return null } const layerItems = !Array.isArray(widgets) ? 
null : currentLocalLayers.map((layer, idx) => { const widget = widgets.find((w) => w.id === layer.widgetId) - let model = {} - if (widget) { - if (widget.viewId) { - model = widget && formedViews[widget.viewId].model - } else { - const config = JSON.parse(widget.config) - model = config.model - } - } + const model = widget && formedViews[widget.viewId].model const layerId = layer.id const { polling, frequency } = JSON.parse(layer.params) @@ -1004,7 +849,6 @@ export class Editor extends React.Component { onResizeLayerStop={this.resizeLayerStop} onDragLayerStop={this.dragLayerStop} onEditWidget={this.toWorkbench} - executeQueryFailed={executeQueryFailed} /> // ) @@ -1027,7 +871,6 @@ export class Editor extends React.Component { settingContent = ( // 最右侧的 设置栏 { dispatch(DisplayActions.editCurrentDisplay(display, resolve)), onEditCurrentSlide: (displayId, slide, resolve?) => dispatch(DisplayActions.editCurrentSlide(displayId, slide, resolve)), onUploadCurrentSlideCover: (cover, resolve) => dispatch(DisplayActions.uploadCurrentSlideCover(cover, resolve)), - onLoadViewDataFromVizItem: (renderType, itemId, viewId, requestParams, statistic) => dispatch(loadViewDataFromVizItem(renderType, itemId, viewId, requestParams, 'display', statistic)), - onViewExecuteQuery: (renderType, itemId, viewId, requestParams, statistic, resolve, reject) => dispatch(loadViewExecuteQuery(renderType, itemId, viewId, requestParams, 'display', statistic, resolve, reject)), - onViewGetProgress: (execId, resolve, reject) => dispatch(loadViewGetProgress(execId, resolve, reject)), - onViewGetResult: (execId, renderType, itemId, viewId, requestParams, statistic, resolve, reject) => dispatch(loadViewGetResult(execId, renderType, itemId, viewId, requestParams, 'display', statistic, resolve, reject)), - onViewKillExecute: (execId, resolve, reject) => dispatch(loadViewKillExecute(execId, resolve, reject)), - + onLoadViewDataFromVizItem: (renderType, itemId, viewId, requestParams) => 
dispatch(loadViewDataFromVizItem(renderType, itemId, viewId, requestParams, 'display')), onLoadViewsDetail: (viewIds, resolve) => dispatch(loadViewsDetail(viewIds, resolve)), onSelectLayer: ({ id, selected, exclusive }) => dispatch(DisplayActions.selectLayer({ id, selected, exclusive })), onClearLayersSelection: () => dispatch(DisplayActions.clearLayersSelection()), diff --git a/webapp/app/containers/Display/Preview.tsx b/webapp/app/containers/Display/Preview.tsx index 1af518027..c41ae2109 100644 --- a/webapp/app/containers/Display/Preview.tsx +++ b/webapp/app/containers/Display/Preview.tsx @@ -231,9 +231,6 @@ export class Preview extends React.Component { statistic.resetClock() } -// 备注:preview接口由前端发起的说明是开发中心所以加入labelsRoute=dev -// 如果是Display或者DashBoard在工作流执行时,发起的执行,是访问后台服务接口, -// 在请求时,由相关接口可以就可以带上环境标签。所以前端侧固定为Dev环境。 public render () { const {spinning} = this.state; const { @@ -242,9 +239,9 @@ export class Preview extends React.Component { const dashboardId = +params.dashboardId let host = `${config[env].host}` if (displayId) { - host += `/displays/${displayId}/preview?labelsRoute=dev` + host += `/displays/${displayId}/preview` } else { - host += `/dashboard/${dashboardId}/preview?labelsRoute=dev` + host += `/dashboard/${dashboardId}/preview` } return (
@@ -300,4 +297,4 @@ export default compose( withSaga, withSagaWidget, withSagaView, - withConnect)(Preview) + withConnect)(Preview) \ No newline at end of file diff --git a/webapp/app/containers/Display/components/DisplayContainer.tsx b/webapp/app/containers/Display/components/DisplayContainer.tsx index c96eb77ee..03e1b5ff5 100644 --- a/webapp/app/containers/Display/components/DisplayContainer.tsx +++ b/webapp/app/containers/Display/components/DisplayContainer.tsx @@ -38,7 +38,6 @@ export enum Keys { interface IDisplayContainerProps { slideParams: any zoomRatio: number - sliderValue: number children: JSX.Element[], onScaleChange: (scale: number) => void onCoverCutCreated: (blob: Blob) => void @@ -70,27 +69,39 @@ export class DisplayContainer extends React.Component void, sliderValue?: number) => { + private containerResize = () => { + const { zoomRatio, slideParams, onScaleChange } = this.props + this.updateStyle(zoomRatio, slideParams, onScaleChange) + } + + private updateStyle = (zoomRatio: number, slideParams: any, onScaleChange: (scale: number) => void) => { const { clientWidth, clientHeight } = this.container.current const [containerWidth, containerHeight] = [clientWidth, clientHeight].map((item) => Math.max(zoomRatio, 1) * item) const { width: slideWidth, height: slideHeight } = slideParams - let scale = sliderValue ? sliderValue / 100 : 1 + let scale = (slideWidth / slideHeight > containerWidth / containerHeight) ? 
+ // landscape + (containerWidth - 64) / slideWidth * zoomRatio : + // portrait + (containerHeight - 64) / slideHeight * zoomRatio + scale = +(Math.floor(scale / 0.05) * 0.05).toFixed(2) const translateX = (Math.max(clientWidth - slideWidth * scale, 64)) / (2 * slideWidth) * 100 const translateY = (Math.max(clientHeight - slideHeight * scale, 64)) / (2 * slideHeight) * 100 const translate = `translate(${translateX}%, ${translateY}%)` @@ -164,8 +175,7 @@ export class DisplayContainer extends React.Component { - this.addSecondaryGraph(SecondaryGraphTypes.Label, item.name, item.name)() - }) } - private addSecondaryGraph = (secondaryGraphType: SecondaryGraphTypes, name?: string, text?: string) => () => { + private addSecondaryGraph = (secondaryGraphType: SecondaryGraphTypes) => () => { if (this.props.layers && this.props.layers.length >= MAX_LAYER_COUNT) return message.warning(`当前最多只支持添加${MAX_LAYER_COUNT}个图层!`, 5); const title = (slideSettings[secondaryGraphType] as any).title - const params = this.getDefaultSetting(GraphTypes.Secondary, secondaryGraphType) - if (text) params.contentText = text this.props.onAddLayers([{ - name: name ? name : `${title}_${uuid(5)}`, + name: `${title}_${uuid(5)}`, type: GraphTypes.Secondary, subType: secondaryGraphType, - params: JSON.stringify(params), - text: text ? 
text : '' + params: JSON.stringify(this.getDefaultSetting(GraphTypes.Secondary, secondaryGraphType)) }]) } diff --git a/webapp/app/containers/Display/components/DisplayList.tsx b/webapp/app/containers/Display/components/DisplayList.tsx index 2f90621d8..e8a414a8f 100644 --- a/webapp/app/containers/Display/components/DisplayList.tsx +++ b/webapp/app/containers/Display/components/DisplayList.tsx @@ -200,8 +200,11 @@ export class DisplayList extends React.PureComponent void onResizeLayerStop?: (itemId: number, deltaSize: IDeltaSize) => void onEditWidget?: (itemId: number, widgetId: number) => void - executeQueryFailed: boolean } interface ILayerItemStates { @@ -119,17 +118,17 @@ export class LayerItem extends React.PureComponent { e.stopPropagation() - // prevWidth和prevHeight是这个layer拖拽前的长宽,不是widget的 const { width: prevWidth, height: prevHeight } = this.state.layerParams - // size里的width和height是这个layer拖拽后的长宽 const { width, height } = size - // 需要根据前后的值算出变动的长宽 const delta = { deltaWidth: width - prevWidth, deltaHeight: height - prevHeight @@ -339,8 +334,7 @@ export class LayerItem extends React.PureComponent this.refLayer = f} @@ -386,8 +375,6 @@ export class LayerItem extends React.PureComponent) )}
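The DisplayContainer hunk above replaces the slider-driven scale with a computed fit-to-container scale: pick the constraining axis (landscape vs. portrait), reserve a 64 px margin, apply the zoom ratio, then snap the result down to a 0.05 step. The arithmetic can be isolated as a pure function (the function name is illustrative, not from the source):

```typescript
// Fit-to-container scale as added in DisplayContainer.updateStyle:
// choose the constraining axis, leave a 64 px margin, apply zoomRatio,
// then snap the scale down to the nearest 0.05.
function computeFitScale(
  slideWidth: number,
  slideHeight: number,
  containerWidth: number,
  containerHeight: number,
  zoomRatio = 1
): number {
  const landscape = slideWidth / slideHeight > containerWidth / containerHeight
  const raw = landscape
    ? ((containerWidth - 64) / slideWidth) * zoomRatio   // width-constrained
    : ((containerHeight - 64) / slideHeight) * zoomRatio // height-constrained
  return +(Math.floor(raw / 0.05) * 0.05).toFixed(2)
}
```

Flooring to a 0.05 step (rather than rounding) guarantees the slide never overflows the margin the hunk reserves.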
@@ -505,7 +492,6 @@ export class LayerItem extends React.PureComponent (e) => { - const { onEditDisplayLayers, layers } = this.props - if (e && e.target && typeof e.target.value === 'string') { - if (e.target.value === '') { - // 输入框里的值变回原值 - e.target.value = layer.name - return message.error('更改失败,标签名不能为空!') - } else if (e.target.value.includes(' ')) { - // 输入框里的值变回原值 - e.target.value = layer.name - return message.error('更改失败,标签名不能包含空格!') - } else { - // 合理的标签名,更新其值 - for (let i = 0; i < layers.length; i++) { - if (layer.id === layers[i].id) { - layers[i].name = e.target.value - onEditDisplayLayers([layers[i]]) - return message.success('更改成功!') - } - } - } - } - } - public render () { const { layers, @@ -223,27 +198,15 @@ export class LayerList extends React.Component )) const layerItems = this.getLayersByIndexDesc(layers) - .map((layer) => - { - const name = layer.name - - return ( -
  • - - - { - layer.subType === 21 ? - // 说明是标签,需要可以编辑 - : - {name} - } - -
  • - ) - }) + .map((layer) => ( +
  • + + {layer.name} +
  • + )) return (

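The LayerList hunk above removes an inline rename handler that validated a layer (label) name before saving: it rejected empty names and names containing spaces, restoring the previous value on failure. The validation rule itself can be sketched as a standalone helper; `validateLayerName` is a hypothetical name, not from the source:

```typescript
// Validation enforced by the removed rename handler: a layer name must be
// non-empty and must not contain spaces. Returns an error message, or null
// when the name is acceptable and the caller may persist the rename.
function validateLayerName(name: string): string | null {
  if (name === '') return 'name must not be empty'
  if (name.includes(' ')) return 'name must not contain spaces'
  return null
}
```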
    diff --git a/webapp/app/containers/Display/components/SettingForm.tsx b/webapp/app/containers/Display/components/SettingForm.tsx index f48dc7884..f40d8a94b 100644 --- a/webapp/app/containers/Display/components/SettingForm.tsx +++ b/webapp/app/containers/Display/components/SettingForm.tsx @@ -23,7 +23,6 @@ import debounce from 'lodash/debounce' import api from 'utils/api' import { Form, Row, Col, Input, InputNumber, Radio, Checkbox, Select, Upload, Icon, Popover, Tooltip } from 'antd' -const { TextArea } = Input const FormItem = Form.Item const RadioGroup = Radio.Group const CheckboxGroup = Checkbox.Group @@ -35,7 +34,6 @@ import { SketchPicker } from 'react-color' const styles = require('../Display.less') interface ISettingFormProps extends FormComponentProps { - currentLocalLayers: any[] id: number settingInfo: any settingParams: any @@ -257,27 +255,6 @@ export class SettingForm extends React.Component { - const { currentLocalLayers, id } = this.props - let layer = {} - for (let i = 0; i < currentLocalLayers.length; i++) { - if (id === currentLocalLayers[i].id) { - layer = currentLocalLayers[i] - break - } - } - let text = '' - if (layer && layer.params) text = JSON.parse(layer.params).contentText - if (item.title === '文本内容') { - // 说明此时是正在编辑 标签 组件的文本内容,需要可以换行的输入框 - return ( -