Prerequisites:
MyBatis
Spring
SpringMVC
SpringBoot
Microservices
Dubbo
The back-office management system is developed with a front-end/back-end separated model. The front end is adapted from a LayUI template; the back end is built on a SpringBoot + Dubbo + SSM architecture.

Back-office system architecture

The back-office services follow the microservice approach and use Dubbo as the service-governance framework. Images are stored in COS object storage. The sections below analyze the relationships between the services, create the projects, and optimize the code.
Microservice architecture

In a monolithic application, all functional modules are deployed on the same machine. As the user base grows, response times degrade, and the only remedy is to deploy another full copy of the application on an additional machine. Suppose the application has just a house-resource module and a user module, and analysis shows the house-resource module consumes the most resources: the most effective way to improve response time would be to give the house-resource module more resources on its own, which a monolith cannot do.

In a microservice architecture, the application is split into services by functional module. At deployment time you can run one user service and one house-resource service on one machine and two house-resource services on another, making the best use of system resources.
Why this project uses Dubbo

Dubbo performs service governance based on a provider-consumer model: within a given business flow, a service is either a service provider or a service consumer, which lowers the coupling between layers.

Consider this scenario: when a front-office user opens the app, a house list is recommended based on location, browsing history, and preferences; when a back-office administrator logs into the management system, they also need a house list. The two roles receive different lists, so different handlers must post-process the list returned by the model layer. But fetching the house list from the model layer is common to both, so that query is extracted into a single service that different controllers call.

This maps naturally onto Dubbo's provider-consumer model: the Service layer produces data that the Controller layer consumes in its own business flows, so the Service layer acts as the service provider and the Controller layer as the service consumer.
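The layering described above can be sketched without any framework. In this minimal illustration (all names are hypothetical; in the real project the wiring is done by Dubbo's `@Service`/`@Reference` annotations and a ZooKeeper registry), the Service layer is the producer and two controllers are consumers that post-process the same result differently:

```java
import java.util.Arrays;
import java.util.List;

public class ProducerConsumerSketch {

    // Producer: the Service layer exposes one query that both controllers reuse
    interface HouseListService {
        List<String> queryHouseList();
    }

    static class HouseListServiceImpl implements HouseListService {
        public List<String> queryHouseList() {
            return Arrays.asList("中远两湾城", "上海康城");
        }
    }

    // Consumer 1: the front-office controller picks a recommendation
    static String frontendController(HouseListService service) {
        return "recommended:" + service.queryHouseList().get(0);
    }

    // Consumer 2: the back-office controller summarizes the same data
    static String adminController(HouseListService service) {
        return "total=" + service.queryHouseList().size();
    }

    public static void main(String[] args) {
        HouseListService service = new HouseListServiceImpl();
        System.out.println(frontendController(service));
        System.out.println(adminController(service));
    }
}
```

The point is only the direction of the dependency: both controllers depend on the one service interface, never on each other.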
Project structure

Parent project: haoke-manage
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.haoke.manage</groupId>
    <artifactId>haoke-manage</artifactId>
    <packaging>pom</packaging>
    <version>1.0-SNAPSHOT</version>
    <modules>
        <module>haoke-manage-dubbo-server</module>
        <module>haoke-manage-api-server</module>
    </modules>
    <parent>
        <artifactId>spring-boot-starter-parent</artifactId>
        <groupId>org.springframework.boot</groupId>
        <version>2.4.3</version>
    </parent>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba.boot</groupId>
            <artifactId>dubbo-spring-boot-starter</artifactId>
            <version>0.2.0</version>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>dubbo</artifactId>
            <version>2.6.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.13</version>
        </dependency>
        <dependency>
            <groupId>com.github.sgroschupf</groupId>
            <artifactId>zkclient</artifactId>
            <version>0.1</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
```
Child project, service provider: haoke-manage-dubbo-server
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <packaging>pom</packaging>
    <modules>
        <module>haoke-manage-dubbo-server-house-resources</module>
        <module>haoke-manage-dubbo-server-generator</module>
    </modules>
    <artifactId>haoke-manage-dubbo-server</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
    </dependencies>
</project>
```
Service consumer: haoke-manage-api-server
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-api-server</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>com.haoke.manage</groupId>
            <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
```
Building the house-resource service

Create the data tables:

```sql
USE haoke;

DROP TABLE IF EXISTS `TB_ESTATE`;
CREATE TABLE `TB_ESTATE` (
  `id` bigint NOT NULL AUTO_INCREMENT,
  `name` varchar(100) DEFAULT NULL COMMENT 'estate name',
  `province` varchar(10) DEFAULT NULL COMMENT 'province',
  `city` varchar(10) DEFAULT NULL COMMENT 'city',
  `area` varchar(10) DEFAULT NULL COMMENT 'district',
  `address` varchar(100) DEFAULT NULL COMMENT 'street address',
  `year` varchar(10) DEFAULT NULL COMMENT 'year built',
  `type` varchar(10) DEFAULT NULL COMMENT 'building type',
  `property_cost` varchar(10) DEFAULT NULL COMMENT 'property management fee',
  `property_company` varchar(20) DEFAULT NULL COMMENT 'property management company',
  `developers` varchar(20) DEFAULT NULL COMMENT 'developer',
  `created` datetime DEFAULT NULL COMMENT 'created time',
  `updated` datetime DEFAULT NULL COMMENT 'updated time',
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1006 DEFAULT CHARSET=utf8 COMMENT='estate table';

INSERT INTO `TB_ESTATE` VALUES
(1001,'中远两湾城','上海市','上海市','普陀区','远景路97弄','2001','塔楼/板楼','1.5','上海中远物业管理发展有限公司','上海万业企业股份有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1002,'上海康城','上海市','上海市','闵行区','莘松路958弄','2001','塔楼/板楼','1.5','盛孚物业','闵行房地产','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1003,'保利西子湾','上海市','上海市','松江区','广富林路1188弄','2008','塔楼/板楼','1.75','上海保利物业管理','上海城乾房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1004,'万科城市花园','上海市','上海市','松江区','广富林路1188弄','2002','塔楼/板楼','1.5','上海保利物业管理','上海城乾房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1005,'上海阳城','上海市','上海市','闵行区','罗锦路888弄','2002','塔楼/板楼','1.5','上海莲阳物业管理有限公司','上海莲城房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20');

CREATE TABLE `TB_HOUSE_RESOURCES` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT,
  `title` varchar(100) DEFAULT NULL COMMENT 'listing title',
  `estate_id` bigint(20) DEFAULT NULL COMMENT 'estate id',
  `building_num` varchar(5) DEFAULT NULL COMMENT 'building number',
  `building_unit` varchar(5) DEFAULT NULL COMMENT 'unit number',
  `building_floor_num` varchar(5) DEFAULT NULL COMMENT 'door number',
  `rent` int(10) DEFAULT NULL COMMENT 'rent',
  `rent_method` tinyint(1) DEFAULT NULL COMMENT 'rental method: 1-whole, 2-shared',
  `payment_method` tinyint(1) DEFAULT NULL COMMENT 'payment method: 1-monthly + 1 deposit, 2-quarterly + 1, 3-semiannually + 1, 4-annually + 1, 5-other',
  `house_type` varchar(255) DEFAULT NULL COMMENT 'layout, e.g. 2室1厅1卫',
  `covered_area` varchar(10) DEFAULT NULL COMMENT 'floor area',
  `use_area` varchar(10) DEFAULT NULL COMMENT 'usable area',
  `floor` varchar(10) DEFAULT NULL COMMENT 'floor, e.g. 8/26',
  `orientation` varchar(2) DEFAULT NULL COMMENT 'orientation: east/south/west/north',
  `decoration` tinyint(1) DEFAULT NULL COMMENT 'decoration: 1-refined, 2-simple, 3-bare',
  `facilities` varchar(50) DEFAULT NULL COMMENT 'facilities, e.g. 1,2,3',
  `pic` varchar(200) DEFAULT NULL COMMENT 'pictures, at most 5',
  `house_desc` varchar(200) DEFAULT NULL COMMENT 'description',
  `contact` varchar(10) DEFAULT NULL COMMENT 'contact name',
  `mobile` varchar(11) DEFAULT NULL COMMENT 'mobile number',
  `time` tinyint(1) DEFAULT NULL COMMENT 'viewing time: 1-morning, 2-noon, 3-afternoon, 4-evening, 5-all day',
  `property_cost` varchar(10) DEFAULT NULL COMMENT 'property management fee',
  `created` datetime DEFAULT NULL,
  `updated` datetime DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 COMMENT='house resource table';
```
POJOs

BasePojo

```java
package com.haoke.server.pojo;

import lombok.Data;

import java.io.Serializable;
import java.util.Date;

@Data
public abstract class BasePojo implements Serializable {

    private Date created;

    private Date updated;
}
```
House-resource POJO

1. Generate the POJO with MyBatis-Plus code generation <span id="generator"></span>

MyBatis-Plus's AutoGenerator plugin generates POJO classes from the table structures in the database.

Create the generator project:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-generator</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.freemarker</groupId>
            <artifactId>freemarker</artifactId>
        </dependency>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-core</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-generator</artifactId>
            <version>3.4.1</version>
        </dependency>
    </dependencies>
</project>
```
Write the CodeGenerator:

```java
import com.baomidou.mybatisplus.core.exceptions.MybatisPlusException;
import com.baomidou.mybatisplus.core.toolkit.StringPool;
import com.baomidou.mybatisplus.core.toolkit.StringUtils;
import com.baomidou.mybatisplus.generator.AutoGenerator;
import com.baomidou.mybatisplus.generator.InjectionConfig;
import com.baomidou.mybatisplus.generator.config.*;
import com.baomidou.mybatisplus.generator.config.po.TableInfo;
import com.baomidou.mybatisplus.generator.config.rules.NamingStrategy;
import com.baomidou.mybatisplus.generator.engine.FreemarkerTemplateEngine;

import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class CodeGenerator {

    /** Read a value interactively from the console. */
    public static String scanner(String tip) {
        Scanner scanner = new Scanner(System.in);
        System.out.println("Please enter the " + tip + ":");
        if (scanner.hasNext()) {
            String ipt = scanner.next();
            if (StringUtils.isNotEmpty(ipt)) {
                return ipt;
            }
        }
        throw new MybatisPlusException("Please enter a valid " + tip + "!");
    }

    public static void main(String[] args) {
        AutoGenerator mpg = new AutoGenerator();

        // Global configuration
        GlobalConfig gc = new GlobalConfig();
        String projectPath = System.getProperty("user.dir");
        gc.setOutputDir(projectPath + "/src/main/java");
        gc.setAuthor("amostian");
        gc.setOpen(false);
        mpg.setGlobalConfig(gc);

        // Data source configuration
        DataSourceConfig dsc = new DataSourceConfig();
        dsc.setUrl("jdbc:mysql://82.157.25.57:4002/haoke?characterEncoding=utf8&useSSL=false&serverTimezone=UTC");
        dsc.setDriverName("com.mysql.cj.jdbc.Driver");
        dsc.setUsername("mycat");
        dsc.setPassword("mycat");
        mpg.setDataSource(dsc);

        // Package configuration
        PackageConfig pc = new PackageConfig();
        pc.setModuleName(scanner("module name"));
        pc.setParent("com.haoke.dubbo.server");
        mpg.setPackageInfo(pc);

        // Custom configuration: write mapper XML files under resources/mapper
        InjectionConfig cfg = new InjectionConfig() {
            @Override
            public void initMap() {
            }
        };
        List<FileOutConfig> focList = new ArrayList<>();
        focList.add(new FileOutConfig("/templates/mapper.xml.ftl") {
            @Override
            public String outputFile(TableInfo tableInfo) {
                return projectPath + "/src/main/resources/mapper/" + pc.getModuleName()
                        + "/" + tableInfo.getEntityName() + "Mapper" + StringPool.DOT_XML;
            }
        });
        cfg.setFileOutConfigList(focList);
        mpg.setCfg(cfg);
        mpg.setTemplate(new TemplateConfig().setXml(null));

        // Strategy configuration
        StrategyConfig strategy = new StrategyConfig();
        strategy.setNaming(NamingStrategy.underline_to_camel);
        strategy.setColumnNaming(NamingStrategy.underline_to_camel);
        strategy.setSuperEntityClass("com.haoke.dubbo.server.pojo.BasePojo");
        strategy.setEntityLombokModel(true);
        strategy.setRestControllerStyle(true);
        strategy.setInclude(scanner("table name"));
        strategy.setSuperEntityColumns("id");
        strategy.setControllerMappingHyphenStyle(true);
        strategy.setTablePrefix(pc.getModuleName() + "_");
        mpg.setStrategy(strategy);

        mpg.setTemplateEngine(new FreemarkerTemplateEngine());
        mpg.execute();
    }
}
```
Run the generator.
Only the generated entity (POJO) classes are needed.
`@EqualsAndHashCode(callSuper = true)` generates `equals` and `hashCode` methods that include the superclass fields; this is usually unnecessary, so it can be removed, along with `@Accessors(chain = true)`.

```java
package com.haoke.server.pojo;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import lombok.Data;
import lombok.EqualsAndHashCode;

@Data
@EqualsAndHashCode(callSuper = true)
@TableName("TB_HOUSE_RESOURCES")
public class HouseResources extends BasePojo {

    private static final long serialVersionUID = -2471649692631014216L;

    @TableId(value = "ID", type = IdType.AUTO)
    private Long id;

    private String title;
    private Long estateId;
    private String buildingNum;
    private String buildingUnit;
    private String buildingFloorNum;
    private Integer rent;
    private Integer rentMethod;
    private Integer paymentMethod;
    private String houseType;
    private String coveredArea;
    private String useArea;
    private String floor;
    private String orientation;
    private Integer decoration;
    private String facilities;
    private String pic;
    private String houseDesc;
    private String contact;
    private String mobile;
    private Integer time;
    private String propertyCost;
}
```
House-resource service project structure

The house-resource module is split into an interface layer and an implementation layer to make it easier to maintain as components.

The implementation layer is a Spring service that implements the business logic; the interface layer is what gets exported as the Dubbo service.

haoke-manage-dubbo-server-house-resources
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
    <packaging>pom</packaging>
    <modules>
        <module>haoke-manage-dubbo-server-house-resources-interface</module>
        <module>haoke-manage-dubbo-server-house-resources-service</module>
    </modules>
    <dependencies>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-boot-starter</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.16</version>
        </dependency>
    </dependencies>
</project>
```
House-resource business interface: haoke-manage-dubbo-server-house-resources-interface
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
</project>
```
House-resource business implementation: haoke-manage-dubbo-server-house-resources-service

The implementation of the house-resource service is a Spring service.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources-service</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>com.haoke.manage</groupId>
            <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
```
Configuration: application.properties
```properties
spring.application.name=haoke-manage-dubbo-server-house-resources

spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://82.157.25.25:4002/haoke?characterEncoding=utf8&useSSL=false&serverTimezone=UTC
spring.datasource.username=mycat
spring.datasource.password=mycat

dubbo.scan.basePackages=com.haoke.server.api
dubbo.application.name=dubbo-provider-house-resources
dubbo.service.version=1.0.0
dubbo.protocol.name=dubbo
dubbo.protocol.port=20880
dubbo.registry.address=zookeeper://8.140.130.91:2181
dubbo.registry.client=zkclient
```
Adding a house resource: service provider

Dubbo service definition, in haoke-manage-dubbo-server-house-resources-interface:
```java
package com.haoke.server.api;

import com.haoke.server.pojo.HouseResources;

public interface ApiHouseResourcesService {

    int saveHouseResources(HouseResources houseResources);
}
```
Implementing the add-house-resource business: create a SpringBoot application that provides the service.

Dao layer: connect to the database. Service layer: implement the CRUD interfaces.

Dao layer

MyBatis-Plus configuration class:
```java
package com.haoke.server.config;

import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Configuration;

@MapperScan("com.haoke.server.mapper")
@Configuration
public class MybatisPlusConfig {
}
```
The HouseResourcesMapper interface:
```java
package com.haoke.server.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.haoke.server.pojo.HouseResources;

public interface HouseResourcesMapper extends BaseMapper<HouseResources> {
}
```
Service layer

This is a Spring service: it is the concrete implementation behind the Dubbo service, is not exposed externally, and is where transaction control and validation logic live.

Define the interface:
```java
package com.haoke.server.service;

import com.haoke.server.pojo.HouseResources;

public interface HouseResourcesService {

    int saveHouseResources(HouseResources houseResources);
}
```
Write the implementation classes.

Generic CRUD implementation:
```java
package com.haoke.server.service.impl;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
import com.haoke.server.pojo.BasePojo;
import org.springframework.beans.factory.annotation.Autowired;

import java.util.Date;
import java.util.List;

public class BaseServiceImpl<T extends BasePojo> {

    @Autowired
    private BaseMapper<T> mapper;

    public T queryById(Long id) {
        return this.mapper.selectById(id);
    }

    public List<T> queryAll() {
        return this.mapper.selectList(null);
    }

    public T queryOne(T record) {
        return this.mapper.selectOne(new QueryWrapper<>(record));
    }

    public List<T> queryListByWhere(T record) {
        return this.mapper.selectList(new QueryWrapper<>(record));
    }

    public IPage<T> queryPageListByWhere(T record, Integer page, Integer rows) {
        return this.mapper.selectPage(new Page<T>(page, rows), new QueryWrapper<>(record));
    }

    // Page query driven by a caller-built QueryWrapper
    public IPage<T> queryPageList(QueryWrapper<T> queryWrapper, Integer page, Integer rows) {
        return this.mapper.selectPage(new Page<T>(page, rows), queryWrapper);
    }

    public Integer save(T record) {
        record.setCreated(new Date());
        record.setUpdated(record.getCreated());
        return this.mapper.insert(record);
    }

    public Integer update(T record) {
        record.setUpdated(new Date());
        return this.mapper.updateById(record);
    }

    public Integer deleteById(Long id) {
        return this.mapper.deleteById(id);
    }

    public Integer deleteByIds(List<Long> ids) {
        return this.mapper.deleteBatchIds(ids);
    }

    public Integer deleteByWhere(T record) {
        return this.mapper.delete(new QueryWrapper<>(record));
    }
}
```
House-resource implementation class: HouseResourcesServiceImpl
```java
package com.haoke.server.service.impl;

import com.alibaba.dubbo.common.utils.StringUtils;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Transactional
@Service
public class HouseResourcesServiceImpl extends BaseServiceImpl<HouseResources> implements HouseResourcesService {

    @Override
    public int saveHouseResources(HouseResources houseResources) {
        // Reject records without a title
        if (StringUtils.isBlank(houseResources.getTitle())) {
            return -1;
        }
        return super.save(houseResources);
    }
}
```
Dubbo service implementation: expose the add-house-resource Dubbo service by exporting the interface implementation.
```java
package com.haoke.server.api;

import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import org.springframework.beans.factory.annotation.Autowired;

@Service(version = "${dubbo.service.version}")
public class ApiHaokeResourcesImpl implements ApiHouseResourcesService {

    @Autowired
    private HouseResourcesService resourcesService;

    @Override
    public int saveHouseResources(HouseResources houseResources) {
        return this.resourcesService.saveHouseResources(houseResources);
    }
}
```
Starting the Dubbo service: launch the SpringBoot application, which exports the Dubbo service to the registry.
```java
package com.haoke.server;

import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

@SpringBootApplication
public class DubboProvider {

    public static void main(String[] args) {
        new SpringApplicationBuilder(DubboProvider.class)
                .web(WebApplicationType.NONE)
                .run(args);
    }
}
```
Start DubboAdmin:

```shell
cd /opt/incubator-dubbo-ops/
mvn --projects dubbo-admin-server spring-boot:run
```
In DubboAdmin, query the Dubbo service provider:
dubbo-provider-house-resources, listening on port 20880.
Service consumer: haoke-manage-api-server

Provides RESTful endpoints to the front-end system; acts as the Dubbo consumer.

Add dependencies: since this module is the Dubbo consumer, it needs the interface and POJO artifacts published by the provider.
```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
```
Consumer configuration file:

```properties
spring.application.name=haoke-manage-api-server
server.port=9091

dubbo.application.name=dubbo-consumer-haoke-manage
dubbo.registry.address=zookeeper://8.140.130.91:2181
dubbo.registry.client=zkclient
dubbo.service.version=1.0.0
```
Consuming the service: HouseResourceService invokes the Dubbo service.
```java
package com.haoke.api.service;

import com.alibaba.dubbo.config.annotation.Reference;
import com.haoke.server.api.ApiHouseResourcesService;
import com.haoke.server.pojo.HouseResources;
import org.springframework.stereotype.Service;

@Service
public class HouseResourceService {

    @Reference(version = "${dubbo.service.version}")
    private ApiHouseResourcesService apiHouseResourcesService;

    public boolean save(HouseResources houseResources) {
        int result = this.apiHouseResourcesService.saveHouseResources(houseResources);
        return result == 1;
    }
}
```
Controller layer:

```java
package com.haoke.api.controller;

import com.haoke.api.service.HouseResourceService;
import com.haoke.server.pojo.HouseResources;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.*;

@RequestMapping("house/resources")
@Controller
public class HouseResourcesController {

    @Autowired
    private HouseResourceService houseResourceService;

    @PostMapping
    @ResponseBody
    public ResponseEntity<Void> save(@RequestBody HouseResources houseResources) {
        try {
            boolean bool = this.houseResourceService.save(houseResources);
            if (bool) {
                return ResponseEntity.status(HttpStatus.CREATED).build();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
    }

    @GetMapping
    @ResponseBody
    public ResponseEntity<String> get() {
        System.out.println("get House Resources");
        return ResponseEntity.ok("ok");
    }
}
```
Test program:

```java
package com.haoke.api;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
public class DubboApiApplication {

    public static void main(String[] args) {
        SpringApplication.run(DubboApiApplication.class, args);
    }
}
```

Test the endpoint.
Front-end/back-end integration

Add a model: create a models folder.
```javascript
import { routerRedux } from 'dva/router';
import { message } from 'antd';
import { addHouseResource } from '@/services/haoke/haoke';

export default {
  namespace: 'house',

  state: {},

  effects: {
    *submitHouseForm({ payload }, { call }) {
      console.log('page model');
      yield call(addHouseResource, payload);
      message.success('提交成功');
    },
  },

  reducers: {},
};
```
Add a service:

```javascript
import request from '@/utils/request';

export async function addHouseResource(params) {
  return request('/haoke/house/resources', {
    method: 'POST',
    body: params,
  });
}
```
Modify the form submission handler:

```javascript
handleSubmit = e => {
  const { dispatch, form } = this.props;
  e.preventDefault();
  form.validateFieldsAndScroll((err, values) => {
    if (!err) {
      if (values.facilities) {
        values.facilities = values.facilities.join(',');
      }
      if (values.floor_1 && values.floor_2) {
        values.floor = `${values.floor_1}/${values.floor_2}`;
      }
      values.houseType = `${values.houseType_1}室${values.houseType_2}厅${values.houseType_3}卫${values.houseType_4}厨${values.houseType_5}阳台`;
      delete values.floor_1;
      delete values.floor_2;
      delete values.houseType_1;
      delete values.houseType_2;
      delete values.houseType_3;
      delete values.houseType_4;
      delete values.houseType_5;
      dispatch({
        type: 'house/submitHouseForm',
        payload: values,
      });
    }
  });
};
```
Use a reverse proxy to solve the cross-origin problem: https://umijs.org/zh-CN/config#proxy
```javascript
proxy: {
  '/haoke/': {
    target: 'http://127.0.0.1:9091',
    changeOrigin: true,
    pathRewrite: { '^/haoke/': '' },
  },
},
```
Proxy effect:
Request: http://127.0.0.1:8000/haoke/house/resources
Actual: http://127.0.0.1:9091/house/resources
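The pathRewrite rule works as a regular-expression replacement on the request path before the request is forwarded to the target. A minimal sketch of that rewrite step in plain Java (an illustration of the idea modeled on the config above, not umi's actual proxy implementation; the replacement here keeps a leading slash so the result is a valid absolute path):

```java
public class ProxyRewriteDemo {

    // Strip the "/haoke/" prefix, mirroring pathRewrite { '^/haoke/': '' }
    static String rewrite(String path) {
        return path.replaceFirst("^/haoke/", "/");
    }

    public static void main(String[] args) {
        // /haoke/house/resources on the dev server becomes
        // /house/resources on the API server
        System.out.println(rewrite("/haoke/house/resources"));
    }
}
```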
House-resource list

- PageInfo: the data returned to the service consumer
- ApiHouseResourcesService: the exposed Dubbo provider interface
- ApiHaokeResourcesImpl: the Dubbo provider implementation
- HouseResourcesService: the Spring service-layer definition
- HouseResourcesServiceImpl: the Spring service-layer implementation
- BaseServiceImpl: MyBatis-Plus database access

1. Define the Dubbo service, in haoke-manage-dubbo-server-house-resources-interface
The Dubbo service provider interface:
```java
package com.haoke.server.api;

import com.haoke.server.pojo.HouseResources;
import com.haoke.server.vo.PageInfo;

public interface ApiHouseResourcesService {

    int saveHouseResources(HouseResources houseResources);

    PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition);
}
```
2. Dubbo provider, in haoke-manage-dubbo-server-house-resources-service

1. Define the data model: the provider wraps the data it returns.
```java
package com.haoke.server.vo;

import lombok.AllArgsConstructor;
import lombok.Data;

import java.util.Collections;
import java.util.List;

@Data
@AllArgsConstructor
public class PageInfo<T> implements java.io.Serializable {

    private Integer total;

    private Integer pageNum;

    private Integer pageSize;

    private List<T> records = Collections.emptyList();
}
```
2. Provider implementation: the Dubbo service implementation simply delegates to the Spring service layer.
```java
package com.haoke.server.api;

import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;

@Service(version = "${dubbo.service.version}")
public class ApiHaokeResourcesImpl implements ApiHouseResourcesService {

    @Autowired
    private HouseResourcesService houseResourcesService;

    @Override
    public int saveHouseResources(HouseResources houseResources) {
        return this.houseResourcesService.saveHouseResources(houseResources);
    }

    @Override
    public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
        return this.houseResourcesService.queryHouseResourcesList(page, pageSize, queryCondition);
    }
}
```
3. List query implementation

Dao layer: MyBatis-Plus fetches the data from the database.
```java
@Override
public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
    QueryWrapper<HouseResources> queryWrapper = new QueryWrapper<>(queryCondition);
    queryWrapper.orderByDesc("updated");
    IPage iPage = super.queryPageList(queryWrapper, page, pageSize);
    return new PageInfo<HouseResources>(Long.valueOf(iPage.getTotal()).intValue(), page, pageSize, iPage.getRecords());
}
```
Service layer: the Spring service layer implements the list query business.

Spring service-layer definition:
```java
package com.haoke.server.service;

import com.haoke.server.pojo.HouseResources;
import com.haoke.server.vo.PageInfo;

public interface HouseResourcesService {

    int saveHouseResources(HouseResources houseResources);

    PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition);
}
```
Spring service-layer implementation:
```java
package com.haoke.server.service.impl;

import com.alibaba.dubbo.common.utils.StringUtils;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Transactional
@Service
public class HouseResourcesServiceImpl extends BaseServiceImpl<HouseResources> implements HouseResourcesService {

    @Override
    public int saveHouseResources(HouseResources houseResources) {
        if (StringUtils.isBlank(houseResources.getTitle())) {
            return -1;
        }
        return super.save(houseResources);
    }

    @Override
    public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
        QueryWrapper<HouseResources> queryWrapper = new QueryWrapper<>(queryCondition);
        queryWrapper.orderByDesc("updated");
        IPage iPage = super.queryPageList(queryWrapper, page, pageSize);
        return new PageInfo<HouseResources>(Long.valueOf(iPage.getTotal()).intValue(), page, pageSize, iPage.getRecords());
    }
}
```
3. Dubbo consumer: implement the RESTful endpoint.

- TableResult: the VO returned to the front end
- Pagination: pagination info
- HouseResourceService: calls the interface provided by the service provider
- HouseResourcesController: the endpoint the consumer exposes to the front end

1. Define the VOs:

```java
import lombok.AllArgsConstructor;
import lombok.Data;

import java.util.List;

@Data
@AllArgsConstructor
public class TableResult<T> {

    private List<T> list;

    private Pagination pagination;
}

@Data
@AllArgsConstructor
public class Pagination {

    private Integer current;

    private Integer pageSize;

    private Integer total;
}
```
2. Call the service provider:

```java
public TableResult queryList(HouseResources houseResources, Integer currentPage, Integer pageSize) {
    PageInfo<HouseResources> pageInfo =
            this.apiHouseResourcesService.queryHouseResourcesList(currentPage, pageSize, houseResources);
    return new TableResult(
            pageInfo.getRecords(),
            new Pagination(currentPage, pageSize, pageInfo.getTotal()));
}
```
3. The endpoint the consumer exposes to the front end:

```java
@GetMapping("/list")
@ResponseBody
public ResponseEntity<TableResult> list(
        HouseResources houseResources,
        @RequestParam(name = "currentPage", defaultValue = "1") Integer currentPage,
        @RequestParam(name = "pageSize", defaultValue = "10") Integer pageSize) {
    return ResponseEntity.ok(this.houseResourceService.queryList(houseResources, currentPage, pageSize));
}
```
4. Test the endpoint.

4. Front-end/back-end integration
1. Modify the front-end table columns:

```javascript
columns = [
  {
    title: '房源编号',
    dataIndex: 'id',
  },
  {
    title: '房源信息',
    dataIndex: 'title',
  },
  {
    title: '图',
    dataIndex: 'pic',
    render: (text, record, index) => <ShowPics pics={text} />,
  },
  {
    title: '楼栋',
    render: (text, record, index) =>
      `${record.buildingNum}栋${record.buildingUnit}单元${record.buildingFloorNum}号`,
  },
  {
    title: '户型',
    dataIndex: 'houseType',
  },
  {
    title: '面积',
    dataIndex: 'useArea',
    render: (text, record, index) => `${text}平方`,
  },
  {
    title: '楼层',
    dataIndex: 'floor',
  },
  {
    title: '操作',
    render: (text, record) => (
      <Fragment>
        <a onClick={() => this.handleUpdateModalVisible(true, record)}>查看</a>
        <Divider type="vertical" />
        <a href="">删除</a>
      </Fragment>
    ),
  },
];
```
2. Custom picture-preview component

```js
import React from 'react';
import { Modal, Button, Carousel } from 'antd';

class ShowPics extends React.Component {
  info = () => {
    Modal.info({
      title: '',
      iconType: 'false',
      width: '800px',
      okText: "ok",
      content: (
        <div style={{ width: 650, height: 400, lineHeight: 400, textAlign: "center" }}>
          <Carousel autoplay>
            {
              this.props.pics.split(',').map((value, index) =>
                <div>
                  <img style={{ maxWidth: 600, maxHeight: 400, margin: "0 auto" }} src={value} />
                </div>
              )
            }
          </Carousel>
        </div>
      ),
      onOk() {},
    });
  };

  constructor(props) {
    super(props);
    this.state = {
      // disable the button when there are no pictures
      btnDisabled: !this.props.pics
    }
  }

  render() {
    return (
      <div>
        <Button disabled={this.state.btnDisabled} icon="picture" shape="circle"
                onClick={() => { this.info() }} />
      </div>
    )
  }
}

export default ShowPics;
```
3. Model layer

```js
import { queryResource } from '@/services/haoke/houseResource';

export default {
  namespace: 'houseResource',

  state: {
    data: {
      list: [],
      pagination: {},
    },
  },

  effects: {
    *fetch({ payload }, { call, put }) {
      console.log("houseResource fetch")
      const response = yield call(queryResource, payload);
      yield put({
        type: 'save',
        payload: response,
      });
    }
  },

  reducers: {
    save(state, action) {
      return {
        ...state,
        data: action.payload,
      };
    },
  },
};
```
4. Modify the data request URL

```js
import request from '@/utils/request';
import { stringify } from 'qs';

export async function queryResource(params) {
  return request(`/haoke/house/resources/list?${stringify(params)}`);
}
```
GraphQL: use GraphQL to develop the house-resource APIs, starting with the house-list query. Introduction
Official site
GraphQL is a specification for how the front end queries data from the back end.
Problems with RESTful APIs

```
GET    http://127.0.0.1/user/1   # query
POST   http://127.0.0.1/user     # create
PUT    http://127.0.0.1/user     # update
DELETE http://127.0.0.1/user     # delete
```
Scenario 1:
Only part of an object's attributes are needed, but a RESTful endpoint returns all of them.
```
# Request
GET http:

# Response:
{
    id: 1001,
    name: "张三",
    age: 20,
    address: "北京市",
    ……
}
```
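With GraphQL the client instead asks for exactly the fields it needs. A sketch against the User type defined later in this section (assuming the server exposes a `user(id)` query):

```graphql
{
    user(id: 1001) {
        name
        age
    }
}
```

The response then contains only `name` and `age`, with `address` and the other fields omitted.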
Scenario 2:
A single requirement takes multiple requests to complete.
```
# Query the user
GET http:

# Response:
{
    id: 1001,
    name: "张三",
    age: 20,
    address: "北京市",
    ……
}

# Query the user's ID-card information
GET http:

# Response:
{
    id: 8888,
    name: "张三",
    cardNumber: "999999999999999",
    address: "北京市",
    ……
}
```
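In GraphQL both objects can be fetched in one round trip. A sketch assuming a schema where `User` has a `card` field, as in the `user.graphql` example later in this section:

```graphql
{
    user(id: 1001) {
        name
        card {
            cardNumber
        }
    }
}
```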
Advantages of GraphQL. 1. Fetch data on demand: when the request only asks for the name field, the response only contains name; add the appearsIn field to the request and the result will include the appearsIn value as well.
Demo: https://graphql.cn/learn/schema/#type-system
2. Query multiple resources in one request
A single request fetches not only the hero data but also the friends data, saving network round trips.
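The query shape, taken from the official GraphQL tutorial, is just a nested field selection — `friends` is requested inside `hero`:

```graphql
{
    hero {
        name
        friends {
            name
        }
    }
}
```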
3. API evolution without versioning
When the API is upgraded, clients do not have to upgrade at the same time; they can catch up later, which greatly reduces coupling between client and server.
The GraphQL query specification: GraphQL defines a specification describing the query syntax: http://graphql.cn/learn/queries/
A specification $\neq$ an implementation.
Fields: in a GraphQL query, the request mirrors the structure of the expected result — these are fields. The response has essentially the same shape as the request, a GraphQL feature that lets the requester know exactly what it will get.
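For example (`hero` and `name` are field names from the official tutorial, not this project):

```graphql
{
    hero {
        name
    }
}
```

The response is JSON of the same shape, e.g. `{"data": {"hero": {"name": "R2-D2"}}}` in the tutorial.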
Arguments — syntax: (argumentName: argumentValue)
Aliases: if one query fetches the same object multiple times with different argument values, aliases are required; otherwise the result keys would conflict and the JSON would be invalid.
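A sketch based on the official tutorial: querying `hero` twice with different `episode` arguments would produce two conflicting `hero` keys, so each selection gets an alias:

```graphql
{
    empireHero: hero(episode: EMPIRE) {
        name
    }
    jediHero: hero(episode: JEDI) {
        name
    }
}
```

The response keys are then `empireHero` and `jediHero` instead of `hero`.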
Fragments: when the queried objects share the same fields, a fragment can factor out the common field definition.
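A sketch, again using the tutorial's `Character` type: the shared field list is declared once as a fragment and spread into both selections:

```graphql
{
    leftComparison: hero(episode: EMPIRE) {
        ...comparisonFields
    }
    rightComparison: hero(episode: JEDI) {
        ...comparisonFields
    }
}

fragment comparisonFields on Character {
    name
    appearsIn
}
```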
GraphQL schema and type specification: a schema defines the shape of the data.
https://graphql.cn/learn/schema/
Schema structure: every GraphQL service has a query type and may have a mutation type. These two types are no different from regular object types; they are special because they define the entry point of every GraphQL query.
```graphql
schema {              # define the query
    query: UserQuery
}

type UserQuery {      # define the query type
    user(id: ID): User   # the field, its argument type, and its object type
}

type User {           # define the object
    id: ID!           # "!" marks the field as non-null
    name: String
    age: Int
}
```
Scalar types: Int — a signed 32-bit integer. Float — a signed double-precision floating-point value. String — a UTF-8 character sequence. Boolean — true or false. ID — a unique identifier, often used to refetch an object or as a cache key. GraphQL also supports custom scalars; the graphql-java implementation, for example, adds Long, Byte, and others.
Enum types

```graphql
enum Episode {        # define an enum
    NEWHOPE
    EMPIRE
    JEDI
}

type Human {
    id: ID!
    name: String!
    appearsIn: [Episode]!   # use the enum type: an array of Episode values
    homePlanet: String
}
```
Interfaces: an interface is an abstract type containing a set of fields; an object type must include all of those fields to implement the interface.
```graphql
interface Character {   # define the interface
    id: ID!
    name: String!
    friends: [Character]
    appearsIn: [Episode]!
}

# implement the interface (must carry every Character field)
type Human implements Character {
    id: ID!
    name: String!
    friends: [Character]!
    appearsIn: [Episode]!
    starship: [Starship]!
    totalCredits: Int
}

type Droid implements Character {
    id: ID!
    name: String!
    friends: [Character]
    appearsIn: [Episode]!
    primaryFunction: String
}
```
Java implementations of GraphQL: the official project only defines the specification; implementations come from third parties.
官网:https://www.graphql-java.com/
https://www.graphql-java.com/documentation/v16/getting-started/
graphql-java is not published to Maven Central; a third-party repository must be added before the dependency can be downloaded.
Maven note: if a mirror is configured via the mirrors section, the third-party repository configuration will not take effect.
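A common workaround is to exclude that repository id from the mirror's mirrorOf pattern in settings.xml, so Maven still contacts it directly (a sketch; the aliyun mirror id and url here are only an example):

```xml
<mirror>
    <id>aliyun</id>
    <mirrorOf>*,!bintray-andimarek-graphql-java</mirrorOf>
    <url>https://maven.aliyun.com/repository/public</url>
</mirror>
```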
1. Import the dependencies

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>graphql</artifactId>
    <version>1.0-SNAPSHOT</version>

    <repositories>
        <repository>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
            <id>bintray-andimarek-graphql-java</id>
            <name>bintray</name>
            <url>https://dl.bintray.com/andimarek/graphql-java</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>com.graphql-java</groupId>
            <artifactId>graphql-java</artifactId>
            <version>11.0</version>
        </dependency>
    </dependencies>
</project>
```
2. Install the plugin
```graphql
schema {
    query: UserQuery
}

type UserQuery {
    user(id: ID): User
}

type User {
    id: ID!
    name: String
    age: Int
}
```
Java API: returning fields on demand

```java
public class GraphQLDemo {

    public static void main(String[] args) {
        // define the User object type
        GraphQLObjectType userType = newObject()
                .name("User")
                .field(newFieldDefinition().name("id").type(GraphQLLong))
                .field(newFieldDefinition().name("name").type(GraphQLString))
                .field(newFieldDefinition().name("age").type(GraphQLInt))
                .build();

        // define the query type; a static data fetcher returns a fixed User
        GraphQLObjectType userQuery = newObject()
                .name("userQuery")
                .field(newFieldDefinition()
                        .name("user")
                        .type(userType)
                        .dataFetcher(new StaticDataFetcher(new User(1L, "张三", 20)))
                )
                .build();

        GraphQLSchema graphQLSchema = GraphQLSchema.newSchema()
                .query(userQuery)
                .build();

        GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();

        // only id and name are requested, so only they are returned
        String query = "{user{id,name}}";
        ExecutionResult executionResult = graphQL.execute(query);

        System.out.println("错误:" + executionResult.getErrors());
        System.out.println("结果:" + executionResult.toSpecification());
    }
}
```
Setting query arguments
```java
public class GraphQLDemo {

    public static void main(String[] args) {
        GraphQLObjectType userType = newObject()
                .name("User")
                .field(newFieldDefinition().name("id").type(GraphQLLong))
                .field(newFieldDefinition().name("name").type(GraphQLString))
                .field(newFieldDefinition().name("age").type(GraphQLInt))
                .build();

        GraphQLObjectType userQuery = newObject()
                .name("userQuery")
                .field(newFieldDefinition()
                        .name("user")
                        // declare the id argument; the type is the GraphQLLong scalar
                        .argument(GraphQLArgument.newArgument()
                                .name("id")
                                .type(GraphQLLong)
                        )
                        .type(userType)
                        // read the argument from the environment and build the result
                        .dataFetcher(environment -> {
                            Long id = environment.getArgument("id");
                            return new User(id, "张三", id.intValue() + 10);
                        })
                )
                .build();

        GraphQLSchema graphQLSchema = GraphQLSchema.newSchema()
                .query(userQuery)
                .build();

        GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();

        String query = "{user(id:100){id,name,age}}";
        ExecutionResult executionResult = graphQL.execute(query);

        System.out.println("错误:" + executionResult.getErrors());
        System.out.println("结果:" + executionResult.toSpecification());
    }
}
```
3. Building the schema from SDL: the GraphQL definition file is parsed and wired to Java data fetchers.
```graphql
schema {
    query: UserQuery
}

type UserQuery {
    user(id: ID): User
}

type User {
    id: ID!
    name: String
    age: Int
    card: Card
}

type Card {
    cardNumber: String!
    userId: ID
}
```
```java
public class GraphQLSDLDemo {

    public static void main(String[] args) throws IOException {
        // read the SDL definition from the classpath
        String fileName = "user.graphql";
        String fileContent = IOUtils.toString(
                GraphQLSDLDemo.class.getClassLoader().getResource(fileName), "UTF-8");
        TypeDefinitionRegistry typeRegistry = new SchemaParser().parse(fileContent);

        // wire the user field to a data fetcher
        RuntimeWiring wiring = RuntimeWiring.newRuntimeWiring()
                .type("UserQuery", builder ->
                        builder.dataFetcher("user", environment -> {
                            Long id = Long.parseLong(environment.getArgument("id"));
                            Card card = new Card("number_" + id, id);
                            return new User(id, "张三_" + id, id.intValue() + 10, card);
                        })
                )
                .build();

        GraphQLSchema graphQLSchema = new SchemaGenerator().makeExecutableSchema(typeRegistry, wiring);
        GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();

        String query = "{user(id:100){id,name,age,card{cardNumber}}}";
        ExecutionResult executionResult = graphQL.execute(query);
        System.out.println(executionResult.toSpecification());
    }
}
```
Query a house resource by id
Dubbo provider side. HouseResourcesService — the Spring service interface:

```java
public HouseResources queryHouseResourcesById(Long id);
```
HouseResourcesServiceImpl — the Spring service implementation:

```java
@Override
public HouseResources queryHouseResourcesById(Long id) {
    return (HouseResources) super.queryById(id);
}
```
ApiHouseResourcesService — the Dubbo provider interface:

```java
HouseResources queryHouseResourcesById(Long id);
```
ApiHaokeResourcesImpl — the Dubbo provider implementation:

```java
@Override
public HouseResources queryHouseResourcesById(Long id) {
    return houseResourcesService.queryHouseResourcesById(id);
}
```
Dubbo consumer side. HouseResourceService:
```java
@Reference(version = "${dubbo.service.version}")
private ApiHouseResourcesService apiHouseResourcesService;

public HouseResources queryHouseResourcesById(Long id) {
    return this.apiHouseResourcesService.queryHouseResourcesById(id);
}
```
GraphQL interface. Import the dependencies:

```xml
<repositories>
    <repository>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
        <id>bintray-andimarek-graphql-java</id>
        <name>bintray</name>
        <url>https://dl.bintray.com/andimarek/graphql-java</url>
    </repository>
</repositories>

<dependency>
    <groupId>com.graphql-java</groupId>
    <artifactId>graphql-java</artifactId>
    <version>16.0</version>
</dependency>
```
GraphQL definition: haoke.graphql
```graphql
schema {
    query: HaokeQuery
}

type HaokeQuery {
    HouseResources(id: ID): HouseResources
}

type HouseResources {
    id: ID!
    title: String
    estateId: ID
    buildingNum: String
    buildingUnit: String
    buildingFloorNum: String
    rent: Int
    rentMethod: Int
    paymentMethod: Int
    houseType: String
    coveredArea: String
    useArea: String
    floor: String
    orientation: String
    decoration: Int
    facilities: String
    pic: String
    houseDesc: String
    contact: String
    mobile: String
    time: Int
    propertyCost: String
}
```
GraphQL component: expose graphql as a Spring bean
```java
@Component
public class GraphQLProvider {

    private GraphQL graphQL;

    @Autowired
    private HouseResourceService houseResourceService;

    @PostConstruct
    public void init() throws FileNotFoundException {
        File file = ResourceUtils.getFile("classpath:haoke.graphql");
        this.graphQL = GraphQL.newGraphQL(
                new SchemaGenerator().makeExecutableSchema(
                        new SchemaParser().parse(file),
                        RuntimeWiring.newRuntimeWiring()
                                .type("HaokeQuery", builder ->
                                        builder.dataFetcher("HouseResources", environment -> {
                                            Long id = Long.parseLong(environment.getArgument("id"));
                                            return this.houseResourceService.queryHouseResourcesById(id);
                                        })
                                )
                                .build()
                )
        ).build();
    }

    @Bean
    GraphQL graphQL() {
        return this.graphQL;
    }
}
```
Expose the endpoint

```java
@RequestMapping("graphql")
@Controller
public class GraphQLController {

    @Autowired
    private GraphQL graphQL;

    @GetMapping
    @ResponseBody
    public Map<String, Object> graphql(@RequestParam("query") String query) {
        return this.graphQL.execute(query).toSpecification();
    }
}
```
Test
Optimizing how the GraphQL component dispatches queries. The problem:
every time a query is added, the init method has to be modified.
Improvement:
Define an interface that every query implementation implements; GraphQLProvider dispatches through those implementations, so adding a new query only requires a new implementation class.

1. Write the MyDataFetcher interface

```java
package com.haoke.api.graphql;

import graphql.schema.DataFetchingEnvironment;

public interface MyDataFetcher {

    // the GraphQL field name this fetcher handles
    String fieldName();

    // fetch the data for that field
    Object dataFetcher(DataFetchingEnvironment environment);
}
```
2. Implement MyDataFetcher

```java
@Component
public class HouseResourcesDataFetcher implements MyDataFetcher {

    @Autowired
    HouseResourceService houseResourceService;

    @Override
    public String fieldName() {
        return "HouseResources";
    }

    @Override
    public Object dataFetcher(DataFetchingEnvironment environment) {
        Long id = Long.parseLong(environment.getArgument("id"));
        return this.houseResourceService.queryHouseResourcesById(id);
    }
}
```
3. Modify GraphQLProvider

```java
this.graphQL = GraphQL.newGraphQL(
        new SchemaGenerator().makeExecutableSchema(
                new SchemaParser().parse(file),
                RuntimeWiring.newRuntimeWiring()
                        .type("HaokeQuery", builder -> {
                            // register every MyDataFetcher implementation
                            for (MyDataFetcher myDataFetcher : myDataFetchers) {
                                builder.dataFetcher(
                                        myDataFetcher.fieldName(),
                                        environment -> myDataFetcher.dataFetcher(environment)
                                );
                            }
                            return builder;
                        })
                        .build()
        )
).build();
```
House-resource APIs (GraphQL): home page carousel ads. 1. Data structure. Request URL:
Response:
So the endpoint only needs to return the image URLs.
2. Table design

```sql
use haoke;

CREATE TABLE `tb_ad` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT,
  `type` int(10) DEFAULT NULL COMMENT '广告类型',
  `title` varchar(100) DEFAULT NULL COMMENT '描述',
  `url` varchar(200) DEFAULT NULL COMMENT '图片URL地址',
  `created` datetime DEFAULT NULL,
  `updated` datetime DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='广告表';

INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES ('1', '1', 'UniCity万科天空之城', 'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/1.jpg', '2021-3-24 16:36:11', '2021-3-24 16:36:16');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES ('2', '1', '天和尚海庭前', 'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/2.jpg', '2021-3-24 16:36:43', '2021-3-24 16:36:37');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES ('3', '1', '[奉贤 南桥] 光语著', 'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/3.jpg', '2021-3-24 16:38:32', '2021-3-24 16:38:26');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES ('4', '1', '[上海周边 嘉兴] 融创海逸长洲', 'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/4.jpg', '2021-3-24 16:39:10', '2021-3-24 16:39:13');
```
3. Implement the query. Dubbo provider side
1. Create the modules
The ad-interface module depends on the common module:

```xml
<dependencies>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-common</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
```
The ad-service module depends on the ad-interface module:

```xml
<dependencies>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-ad-interface</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
```
2. application.properties

```properties
spring.application.name=haoke-manage-dubbo-server-ad

spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://8.140.130.91:3306/myhome?characterEncoding=utf8&useSSL=false&serverTimezone=UTC&autoReconnect=true&allowMultiQueries=true
spring.datasource.username=root
spring.datasource.password=root

spring.datasource.hikari.maximum-pool-size=60
spring.datasource.hikari.idle-timeout=60000
spring.datasource.hikari.connection-timeout=60000
spring.datasource.hikari.validation-timeout=3000
spring.datasource.hikari.login-timeout=5
spring.datasource.hikari.max-lifetime=60000

dubbo.scan.basePackages=com.haoke.server.api
dubbo.application.name=dubbo-provider-ad
dubbo.service.version=1.0.0
dubbo.protocol.name=dubbo
dubbo.protocol.port=21880
dubbo.registry.address=zookeeper://8.140.130.91:2181
dubbo.registry.client=zkclient
```
3. DAO layer. POJO:
```java
@Data
@TableName("tb_ad")
public class Ad extends BasePojo {

    private static final long serialVersionUID = -493439243433085768L;

    @TableId(value = "id", type = IdType.AUTO)
    private Long id;

    private Integer type;

    private String title;

    private String url;
}
```
ApiAdService — the Dubbo service interface:

```java
package com.haoke.server.api;

import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;

public interface ApiAdService {

    PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize);
}
```
AdMapper

```java
package com.haoke.server.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.haoke.server.pojo.Ad;

public interface AdMapper extends BaseMapper<Ad> {
}
```
MybatisPlusConfig — pagination configuration:
```java
package com.haoke.server.config;

import com.baomidou.mybatisplus.annotation.DbType;
import com.baomidou.mybatisplus.extension.plugins.MybatisPlusInterceptor;
import com.baomidou.mybatisplus.extension.plugins.inner.PaginationInnerInterceptor;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@MapperScan("com.haoke.server.mapper")
@Configuration
public class MybatisPlusConfig {

    @Bean
    public MybatisPlusInterceptor mybatisPlusInterceptor() {
        MybatisPlusInterceptor interceptor = new MybatisPlusInterceptor();
        PaginationInnerInterceptor paginationInnerInterceptor = new PaginationInnerInterceptor();
        paginationInnerInterceptor.setDbType(DbType.MYSQL);
        interceptor.addInnerInterceptor(paginationInnerInterceptor);
        return interceptor;
    }
}
```
4. Service layer — business logic
Interface:
```java
package com.haoke.server.service;

import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;

public interface AdService {

    PageInfo<Ad> queryAdList(Ad ad, Integer page, Integer pageSize);
}
```
Implementation:
```java
package com.haoke.server.service.impl;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.haoke.server.pojo.Ad;
import com.haoke.server.service.AdService;
import com.haoke.server.service.BaseServiceImpl;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;

@Service
public class AdServiceImpl extends BaseServiceImpl implements AdService {

    @Override
    public PageInfo<Ad> queryAdList(Ad ad, Integer page, Integer pageSize) {
        QueryWrapper queryWrapper = new QueryWrapper();
        // newest first, filtered by ad type
        queryWrapper.orderByDesc("updated");
        queryWrapper.eq("type", ad.getType());
        IPage iPage = super.queryPageList(queryWrapper, page, pageSize);
        return new PageInfo<>(Long.valueOf(iPage.getTotal()).intValue(), page, pageSize, iPage.getRecords());
    }
}
```
5. Dubbo service implementation. The service interface:

```java
package com.haoke.server.api;

import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;

public interface ApiAdService {

    PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize);
}
```
```java
package com.haoke.server.api;

import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.Ad;
import com.haoke.server.service.AdService;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;

@Service(version = "${dubbo.service.version}")
public class ApiAdServiceImpl implements ApiAdService {

    @Autowired
    private AdService adService;

    @Override
    public PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize) {
        Ad ad = new Ad();
        ad.setType(type);
        return this.adService.queryAdList(ad, page, pageSize);
    }
}
```
6. Bootstrap class

```java
package com.haoke.server;

import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

@SpringBootApplication
public class AdDubboProvider {

    public static void main(String[] args) {
        new SpringApplicationBuilder(AdDubboProvider.class)
                .web(WebApplicationType.NONE)
                .run(args);
    }
}
```
4. API implementation (Dubbo consumer). 1. Import the dependency

```xml
<dependency>
    <groupId>com.haoke.manage</groupId>
    <artifactId>haoke-manage-dubbo-server-ad-interface</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
```
2. Write WebResult
```java
package com.haoke.api.vo;

import com.fasterxml.jackson.annotation.JsonIgnore;
import lombok.AllArgsConstructor;
import lombok.Data;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Data
@AllArgsConstructor
public class WebResult {

    @JsonIgnore
    private int status;

    @JsonIgnore
    private String msg;

    @JsonIgnore
    private List<?> list;

    @JsonIgnore
    public static WebResult ok(List<?> list) {
        return new WebResult(200, "成功", list);
    }

    @JsonIgnore
    public static WebResult ok(List<?> list, String msg) {
        return new WebResult(200, msg, list);
    }

    public Map<String, Object> getData() {
        HashMap<String, Object> data = new HashMap<String, Object>();
        data.put("list", this.list);
        return data;
    }

    public Map<String, Object> getMeta() {
        HashMap<String, Object> meta = new HashMap<String, Object>();
        meta.put("msg", this.msg);
        meta.put("status", this.status);
        return meta;
    }
}
```
3. Write the Service

```java
package com.haoke.api.service;

import com.alibaba.dubbo.config.annotation.Reference;
import com.haoke.server.api.ApiAdService;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;

@Service
public class AdService {

    @Reference(version = "1.0.0")
    private ApiAdService apiAdService;

    public PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize) {
        return this.apiAdService.queryAdList(type, page, pageSize);
    }
}
```
4. Controller

```java
package com.haoke.api.controller;

import com.haoke.api.service.AdService;
import com.haoke.api.vo.WebResult;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

@RequestMapping("ad")
@RestController
@CrossOrigin
public class AdController {

    @Autowired
    private AdService adService;

    @GetMapping
    public WebResult queryIndexad() {
        // home page carousel: ads of type 1, first page, three records
        PageInfo<Ad> pageInfo = this.adService.queryAdList(1, 1, 3);
        List<Ad> ads = pageInfo.getRecords();
        List<Map<String, Object>> data = new ArrayList<>();
        for (Ad ad : ads) {
            Map<String, Object> map = new HashMap<>();
            map.put("original", ad.getUrl());
            data.add(map);
        }
        return WebResult.ok(data);
    }
}
```
Test
5. Integrate the front-end portal: change the request URL in home.js
```js
let swipe = new Promise((resolve, reject) => {
  axios.get('http://127.0.0.1:9091/ad').then((data) => {
    resolve(data.data.list);
  });
})
```
Cross-origin problem:
6. GraphQL endpoint for the ads. 1. Target data structure

```json
{
    "list": [
        { "original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/15432030275359146.jpg" },
        { "original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/15432029946721854.jpg" },
        { "original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/1543202958579877.jpg" }
    ]
}
```
2. GraphQL definition

```graphql
type HaokeQuery {
    HouseResourcesList(page: Int, pageSize: Int): TableResult
    HouseResources(id: ID): HouseResources
    IndexAdList: IndexAdResult
}

type IndexAdResult {
    list: [IndexAdResultData]
}

type IndexAdResultData {
    original: String
}
```
3. Write VOs matching the GraphQL structure

```java
package com.haoke.api.vo.ad.index;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

import java.util.List;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class IndexAdResult {
    private List<IndexAdResultData> list;
}
```
```java
package com.haoke.api.vo.ad.index;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class IndexAdResultData {
    private String original;
}
```
4. IndexAdDataFetcher

```java
package com.haoke.api.graphql.myDataFetcherImpl;

import com.haoke.api.graphql.MyDataFetcher;
import com.haoke.api.service.AdService;
import com.haoke.api.vo.ad.index.IndexAdResult;
import com.haoke.api.vo.ad.index.IndexAdResultData;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import graphql.schema.DataFetchingEnvironment;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import java.util.ArrayList;
import java.util.List;

@Component
public class IndexAdDataFetcher implements MyDataFetcher {

    @Autowired
    private AdService adService;

    @Override
    public String fieldName() {
        return "IndexAdList";
    }

    @Override
    public Object dataFetcher(DataFetchingEnvironment environment) {
        PageInfo<Ad> pageInfo = this.adService.queryAdList(1, 1, 3);
        List<Ad> ads = pageInfo.getRecords();
        List<IndexAdResultData> list = new ArrayList<>();
        for (Ad ad : ads) {
            list.add(new IndexAdResultData(ad.getUrl()));
        }
        return new IndexAdResult(list);
    }
}
```
5. Test

```graphql
{
    IndexAdList {
        list {
            original
        }
    }
}
```
7. GraphQL client
Docs: https://www.apollographql.com/docs/react/get-started/
1. Install the dependencies

```shell
npm install @apollo/client graphql
```
2. Create the client

```js
import { ApolloClient, gql, InMemoryCache } from '@apollo/client';

// Apollo Client 3 requires a cache instance
const client = new ApolloClient({
  uri: 'http://127.0.0.1:9091/graphql',
  cache: new InMemoryCache()
});
```
3. Define the query

```js
const GET_INDEX_ADS = gql`
  {
    IndexAdList {
      list {
        original
      }
    }
  }
`;

let swipe = new Promise((resolve, reject) => {
  client.query({ query: GET_INDEX_ADS }).then(result =>
    resolve(result.data.IndexAdList.list));
})
```
4. Test
Two problems:
The GraphQL service does not allow cross-origin (CORS) requests — add @CrossOrigin to the controller. Apollo Client sends POST requests, while the current GraphQL controller only handles GET.

```java
package com.haoke.api.controller;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import graphql.GraphQL;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.*;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

@RequestMapping("graphql")
@Controller
@CrossOrigin
public class GraphQLController {

    @Autowired
    private GraphQL graphQL;

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @GetMapping
    @ResponseBody
    public Map<String, Object> graphql(@RequestParam("query") String query) {
        return this.graphQL.execute(query).toSpecification();
    }

    @PostMapping
    @ResponseBody
    public Map<String, Object> postGraphql(@RequestBody String json) {
        try {
            // the POST body is JSON; extract the "query" field
            JsonNode jsonNode = MAPPER.readTree(json);
            if (jsonNode.has("query")) {
                String query = jsonNode.get("query").asText();
                return this.graphQL.execute(query).toSpecification();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        Map<String, Object> error = new HashMap<>();
        error.put("status", 500);
        error.put("msg", "查询出错");
        return error;
    }
}
```
House list. 1. Query definition: haoke.graphql
```graphql
schema {
    query: HaokeQuery
}

type HaokeQuery {
    # paged house-resource query, used by the portal's house list
    HouseResourcesList(page: Int, pageSize: Int): TableResult
    # query a house resource by id
    HouseResources(id: ID): HouseResources
    # home page carousel ads, used by the portal's home page
    IndexAdList: IndexAdResult
}

type HouseResources {
    id: ID!
    title: String
    estateId: ID
    buildingNum: String
    buildingUnit: String
    buildingFloorNum: String
    rent: Int
    rentMethod: Int
    paymentMethod: Int
    houseType: String
    coveredArea: String
    useArea: String
    floor: String
    orientation: String
    decoration: Int
    facilities: String
    pic: String
    houseDesc: String
    contact: String
    mobile: String
    time: Int
    propertyCost: String
}

type TableResult {
    list: [HouseResources]
    pagination: Pagination
}

type Pagination {
    current: Int
    pageSize: Int
    total: Int
}
```
2. DataFetcher — HouseResourcesListDataFetcher
```java
@Component
public class HouseResourcesListDataFetcher implements MyDataFetcher {

    @Autowired
    HouseResourceService houseResourceService;

    @Override
    public String fieldName() {
        return "HouseResourcesList";
    }

    @Override
    public Object dataFetcher(DataFetchingEnvironment environment) {
        // default paging when arguments are omitted
        Integer page = environment.getArgument("page");
        if (page == null) {
            page = 1;
        }
        Integer pageSize = environment.getArgument("pageSize");
        if (pageSize == null) {
            pageSize = 5;
        }
        return this.houseResourceService.queryList(null, page, pageSize);
    }
}
```
3. GraphQL variables. Problem: in the carousel query above, the arguments are hard-coded; in practice the query must be driven by the parameters the front end sends.
https://graphql.cn/learn/queries/#variables
One option is to interpolate the values directly into the request body (POST) or URL (GET); the drawback is that anyone can fetch arbitrary data simply by editing the query string.
GraphQL has a first-class way to factor dynamic values out of the query and pass them in as a separate dictionary. These dynamic values are called variables.
Analysis of the parameters the front-end portal sends:

```graphql
query hk($id: ID) {
    HouseResources(id: $id) {
        id
        title
    }
}
```
This is what the front end sends; the back end must handle the request and return the corresponding data.
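Over HTTP this arrives as a JSON body whose query, variables, and operationName fields follow the common GraphQL-over-HTTP convention (the id value here is only an example):

```json
{
    "query": "query hk($id: ID) { HouseResources(id: $id) { id title } }",
    "variables": { "id": 1 },
    "operationName": "hk"
}
```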
4. Handle the variables on the back end
From GraphQL's execution flow, the incoming GraphQL string is ultimately wrapped into an ExecutionInput object.

GraphQLController
```java
package com.haoke.api.controller;

@RequestMapping("graphql")
@Controller
@CrossOrigin
public class GraphQLController {

    @Autowired
    private GraphQL graphQL;

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @GetMapping
    @ResponseBody
    public Map<String, Object> graphql(@RequestParam("query") String query,
                                       @RequestParam(value = "variables", required = false) String variablesJSON,
                                       @RequestParam(value = "operationName", required = false) String operationName) {
        try {
            Map<String, Object> variables = MAPPER.readValue(variablesJSON,
                    MAPPER.getTypeFactory().constructMapType(HashMap.class, String.class, Object.class));
            return this.executeGraphQLQuery(query, operationName, variables);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
        Map<String, Object> error = new HashMap<>();
        error.put("status", 500);
        error.put("msg", "查询出错");
        return error;
    }

    @PostMapping
    @ResponseBody
    public Map<String, Object> postGraphql(@RequestBody Map<String, Object> map) {
        try {
            String query = (String) map.get("query");
            if (null == query) {
                query = "";
            }
            String operationName = (String) map.get("operationName");
            if (null == operationName) {
                operationName = "";
            }
            Map variables = (Map) map.get("variables");
            if (variables == null) {
                variables = Collections.EMPTY_MAP;
            }
            return this.executeGraphQLQuery(query, operationName, variables);
        } catch (Exception e) {
            e.printStackTrace();
        }
        Map<String, Object> error = new HashMap<>();
        error.put("status", 500);
        error.put("msg", "查询出错");
        return error;
    }

    private Map<String, Object> executeGraphQLQuery(String query, String operationName, Map<String, Object> variables) {
        return this.graphQL.execute(
                ExecutionInput.newExecutionInput()
                        .query(query)
                        .variables(variables)
                        .operationName(operationName)
                        .build()
        ).toSpecification();
    }
}
```
5. Query string

```graphql
query HouseResourcesList($pageSize: Int, $page: Int) {
  HouseResourcesList(pageSize: $pageSize, page: $page) {
    list {
      id
      title
      pic
      coveredArea
      orientation
      floor
      rent
    }
  }
}
```

Variables:

```json
{
  "pageSize": 2,
  "page": 1
}
```
6. Rework the list.js page

```jsx
import React from 'react';
import { withRouter } from 'react-router';
import { Icon, Item } from 'semantic-ui-react';
import config from '../../common.js';
import { ApolloClient, gql, InMemoryCache } from '@apollo/client';

const client = new ApolloClient({
    uri: 'http://127.0.0.1:9091/graphql',
    cache: new InMemoryCache()
});

const QUERY_LIST = gql`
    query HouseResourcesList($pageSize: Int, $page: Int) {
        HouseResourcesList(pageSize: $pageSize, page: $page) {
            list {
                id
                title
                pic
                coveredArea
                orientation
                floor
                rent
            }
        }
    }
`;

class HouseList extends React.Component {
    constructor(props) {
        super(props);
        this.state = {
            listData: [],
            typeName: '',
            type: null,
            loadFlag: false
        };
    }

    goBack = () => {
        console.log(this.props.history)
        this.props.history.goBack();
    }

    componentDidMount = () => {
        const { query } = this.props.location.state;
        this.setState({
            typeName: query.name,
            type: query.type
        })
        client.query({ query: QUERY_LIST, variables: { "pageSize": 2, "page": 1 } }).then(result => {
            console.log(result)
            this.setState({
                listData: result.data.HouseResourcesList.list,
                loadFlag: true
            })
        })
    }

    render() {
        let list = null;
        if (this.state.loadFlag) {
            list = this.state.listData.map(item => {
                return (
                    <Item key={item.id}>
                        <Item.Image src={item.pic.split(',')[0]}/>
                        <Item.Content>
                            <Item.Header>{item.title}</Item.Header>
                            <Item.Meta>
                                <span className='cinema'>{item.coveredArea}㎡/{item.orientation}/{item.floor}</span>
                            </Item.Meta>
                            <Item.Description>上海</Item.Description>
                            <Item.Description>{item.rent}</Item.Description>
                        </Item.Content>
                    </Item>
                )
            });
        }
        return (
            <div className='house-list'>
                <div className="house-list-title">
                    <Icon onClick={this.goBack} name='angle left' size='large'/>
                    {this.state.typeName}
                </div>
                <div className="house-list-content">
                    <Item.Group divided unstackable>
                        {list}
                    </Item.Group>
                </div>
            </div>
        );
    }
}

export default withRouter(HouseList);
```
Updating house resource data
1. Controller (haoke-manage-api-server)
```java
@PutMapping
@ResponseBody
public ResponseEntity<Void> update(@RequestBody HouseResources houseResources) {
    try {
        boolean bool = this.houseResourceService.update(houseResources);
        if (bool) {
            return ResponseEntity.status(HttpStatus.NO_CONTENT).build();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
}
```
2. Service (haoke-manage-api-server)
```java
public boolean update(HouseResources houseResources) {
    return this.apiHouseResourcesService.updateHouseResources(houseResources);
}
```
3. Modify the Dubbo service (haoke-manage-dubbo-server-house-resources-interface)
ApiHouseResourcesService
```java
boolean updateHouseResources(HouseResources houseResources);
```
Implementation class ApiHouseResourcesServiceImpl
```java
@Override
public boolean updateHouseResources(HouseResources houseResources) {
    return this.houseResourcesService.updateHouseResources(houseResources);
}
```
Modify the business service HouseResourcesServiceImpl
```java
@Override
public boolean updateHouseResources(HouseResources houseResources) {
    return super.update(houseResources) == 1;
}
```
BaseServiceImpl
```java
public Integer update(T record) {
    record.setUpdated(new Date());
    return this.mapper.updateById(record);
}
```
Building the admin pages

1. Modify the house resource list page

```jsx
render: (text, record) => (
    <Fragment>
        <a onClick={() => this.handleUpdateModalVisible(true, record)}>查看</a>
        <Divider type="vertical"/>
        {/* 弹窗组件 */}
        <EditResource record={record} reload={this.reload.bind(this)}/>
        <Divider type="vertical"/>
        <a href="">删除</a>
    </Fragment>
),

reload() {
    const { dispatch } = this.props;
    dispatch({
        type: 'houseResource/fetch'
    });
}
```
2. EditResource.js

```jsx
import React from 'react';
import { Card, Checkbox, Form, Input, Modal, Select } from "antd";
import { connect } from "dva";
import PicturesWall from "../Utils/PicturesWall";

const FormItem = Form.Item;
const InputGroup = Input.Group;
const CheckboxGroup = Checkbox.Group;
const { TextArea } = Input;

const formItemLayout = {
    labelCol: {
        xs: { span: 24 },
        sm: { span: 7 },
    },
    wrapperCol: {
        xs: { span: 24 },
        sm: { span: 12 },
        md: { span: 10 },
    },
};

const paymentMethod = ["", "付一押一", "付三押一", "付六押一", "年付押一", "其他"]
const decoration = ["", "精装", "简装", "毛坯"]
const rentMethod = ["", "整租", "合租"]
const time = ["", "上午", "中午", "下午", "晚上", "全天"]
const facilities = ["", "水", "电", "煤气/天然气", "暖气", "有线电视", "宽带", "电梯", "车位/车库", "地下室/储藏室"]

function isChinese(temp) {
    const re = /^[\u3220-\uFA29]+$/;
    if (re.test(temp)) return true;
    return false;
}

@connect()
@Form.create()
class EditResource extends React.Component {
    constructor(props) {
        super(props);
        console.log("====传来的信息=====")
        console.log(this.props.record)
        this.state = {
            visible: false,
            pics: new Set()
        };
    }

    showModal = () => {
        this.setState({ visible: true });
    };

    handleCancel = () => {
        this.setState({ visible: false });
    };

    handleSave = () => {
        const { dispatch, form, record } = this.props;
        form.validateFieldsAndScroll((err, values) => {
            if (!err) {
                values.id = record.id;
                // map Chinese labels back to their numeric codes
                if (isChinese(values.time)) {
                    for (let i = 1; i < time.length; i++) {
                        if (time[i] == values.time) values.time = i;
                    }
                }
                if (isChinese(values.paymentMethod)) {
                    for (let i = 1; i < paymentMethod.length; i++) {
                        if (paymentMethod[i] == values.paymentMethod) values.paymentMethod = i;
                    }
                }
                if (isChinese(values.rentMethod)) {
                    for (let i = 1; i < rentMethod.length; i++) {
                        if (rentMethod[i] == values.rentMethod) values.rentMethod = i;
                    }
                }
                if (isChinese(values.decoration)) {
                    for (let i = 1; i < decoration.length; i++) {
                        if (decoration[i] == values.decoration) values.decoration = i;
                    }
                }
                if (values.floor_1 && values.floor_2) {
                    values.floor = `${values.floor_1}/${values.floor_2}`;
                }
                if (values.facilities) {
                    values.facilities = values.facilities.join(",");
                }
                values.buildingNum = record.buildingNum;
                values.buildingUnit = record.buildingUnit;
                values.buildingFloorNum = record.buildingFloorNum;
                delete values.building;
                if (this.state.pics.size > 0) {
                    values.pic = [...this.state.pics].join(',');
                } else {
                    values.pic = record.pic;
                }
                console.log("====提交的信息=====")
                console.log(values)
                dispatch({
                    type: 'house/updateHouseForm',
                    payload: values,
                });
                setTimeout(() => {
                    this.handleCancel();
                    this.props.reload();
                }, 500)
            }
        });
    };

    handleFileList = (obj) => {
        const pics = new Set();
        obj.forEach((v, k) => {
            if (v.response) {
                pics.add(v.response.name);
            }
            if (v.url) {
                pics.add(v.url);
            }
        });
        this.setState({ pics })
    }

    render() {
        const { record } = this.props;
        const { form: { getFieldDecorator } } = this.props;
        return (
            <React.Fragment>
                <a onClick={() => { this.showModal() }}>编辑</a>
                <Modal title="编辑" width={750} visible={this.state.visible}
                       onOk={() => { this.handleSave() }}
                       onCancel={() => { this.handleCancel() }}
                       destroyOnClose>
                    <div style={{ overflowY: 'auto' }}>
                        <Form hideRequiredMark style={{ marginTop: 8 }}>
                            <Card bordered={false} title="出租信息">
                                <FormItem {...formItemLayout} label="房源信息">
                                    {getFieldDecorator('title', { initialValue: record.title, rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Input style={{ width: '100%' }} disabled={false}/>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="联系人">
                                    {getFieldDecorator('contact', { initialValue: record.contact, rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Input style={{ width: '100%' }}/>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="联系方式">
                                    {getFieldDecorator('mobile', { initialValue: record.mobile, rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Input style={{ width: '100%' }}/>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="看房时间">
                                    {getFieldDecorator('time', { initialValue: time[record.time], rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Select onSelect={record.time} style={{ width: '30%' }}>
                                            <Option value="1">上午</Option>
                                            <Option value="2">中午</Option>
                                            <Option value="3">下午</Option>
                                            <Option value="4">晚上</Option>
                                            <Option value="5">全天</Option>
                                        </Select>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="租金">
                                    <InputGroup compact>
                                        {getFieldDecorator('rent', { initialValue: record.rent, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input style={{ width: '50%' }} addonAfter="元/月"/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="物业费">
                                    <InputGroup compact>
                                        {getFieldDecorator('propertyCost', { initialValue: record.propertyCost, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input style={{ width: '50%' }} addonAfter="元/月"/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="支付方式">
                                    {getFieldDecorator('paymentMethod', { initialValue: paymentMethod[record.paymentMethod], rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Select onSelect={record.paymentMethod} style={{ width: '50%' }}>
                                            <Option value="1">付一押一</Option>
                                            <Option value="2">付三押一</Option>
                                            <Option value="3">付六押一</Option>
                                            <Option value="4">年付押一</Option>
                                            <Option value="5">其它</Option>
                                        </Select>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="租赁方式">
                                    {getFieldDecorator('rentMethod', { initialValue: rentMethod[record.rentMethod], rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Select style={{ width: '50%' }}>
                                            <Option value="1">整租</Option>
                                            <Option value="2">合租</Option>
                                        </Select>)}
                                </FormItem>
                            </Card>
                            <Card bordered={false} title="房源信息">
                                <FormItem {...formItemLayout} label="建筑面积">
                                    <InputGroup compact>
                                        {getFieldDecorator('coveredArea', { initialValue: record.coveredArea, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input style={{ width: '40%' }} addonAfter="平米"/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="使用面积">
                                    <InputGroup compact>
                                        {getFieldDecorator('useArea', { initialValue: record.useArea, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input style={{ width: '40%' }} addonAfter="平米"/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="楼栋">
                                    <InputGroup compact>
                                        {getFieldDecorator('building', { initialValue: `${record.buildingNum}栋${record.buildingUnit}单元${record.buildingFloorNum}号`, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input disabled style={{ width: '55%' }}/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="楼层">
                                    <InputGroup compact>
                                        {getFieldDecorator('floor_1', { initialValue: record.floor.toString().split('/')[0], rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input disabled style={{ width: '45%' }} addonBefore="第" addonAfter="层"/>)}
                                        {getFieldDecorator('floor_2', { initialValue: record.floor.toString().split('/')[1], rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input disabled style={{ width: '45%' }} addonBefore="总" addonAfter="层"/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="朝向">
                                    {getFieldDecorator('orientation', { initialValue: record.orientation, rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Select disabled style={{ width: '20%' }}>
                                            <Option value="南">南</Option>
                                            <Option value="北">北</Option>
                                            <Option value="东">东</Option>
                                            <Option value="西">西</Option>
                                        </Select>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="户型">
                                    <InputGroup compact>
                                        {getFieldDecorator('houseType', { initialValue: record.houseType, rules: [{ required: true, message: "此项为必填项" }] })(
                                            <Input disabled style={{ width: '55%' }}/>)}
                                    </InputGroup>
                                </FormItem>
                                <FormItem {...formItemLayout} label="装修">
                                    {getFieldDecorator('decoration', { initialValue: decoration[record.decoration], rules: [{ required: true, message: "此项为必填项" }] })(
                                        <Select style={{ width: '35%' }}>
                                            <Option value="1">精装</Option>
                                            <Option value="2">简装</Option>
                                            <Option value="3">毛坯</Option>
                                        </Select>)}
                                </FormItem>
                                <FormItem {...formItemLayout} label="配套设施">
                                    {getFieldDecorator('facilities', { initialValue: record.facilities.split(','), rules: [{ required: true, message: "此项为必填项" }] })(
                                        <CheckboxGroup options={[
                                            { label: '水', value: '1' },
                                            { label: '电', value: '2' },
                                            { label: '煤气/天然气', value: '3' },
                                            { label: '暖气', value: '4' },
                                            { label: '有线电视', value: '5' },
                                            { label: '宽带', value: '6' },
                                            { label: '电梯', value: '7' },
                                            { label: '车位/车库', value: '8' },
                                            { label: '地下室/储藏室', value: '9' }
                                        ]}/>)}
                                </FormItem>
                            </Card>
                            <Card bordered={false} title="图片信息">
                                <FormItem {...formItemLayout} label="房源描述">
                                    {getFieldDecorator('houseDesc', { initialValue: record.houseDesc, rules: [{ required: false }] })(
                                        <TextArea autosize={{ minRows: 4, maxRows: 10 }}/>)}
                                    <span>请勿填写联系方式或与房源无关信息以及图片、链接或名牌、优秀、顶级、全网首发、零距离、回报率等词汇。</span>
                                </FormItem>
                                <FormItem {...formItemLayout} label="上传室内图">
                                    <PicturesWall value={record.pic}
                                                  handleFileList={this.handleFileList.bind(this)}
                                                  fileList={record.pic}/>
                                </FormItem>
                            </Card>
                        </Form>
                    </div>
                </Modal>
            </React.Fragment>
        )
    }
}

export default EditResource;
```
3. Modify the submit logic

```js
import { routerRedux } from 'dva/router';
import { message } from 'antd';
import { addHouseResource, updateHouseResource } from '@/services/haoke/haoke';

export default {
    namespace: 'house',

    state: {},

    effects: {
        *submitHouseForm({ payload }, { call }) {
            console.log("page model")
            yield call(addHouseResource, payload);
            message.success('提交成功');
        },
        *updateHouseForm({ payload }, { call }) {
            console.log("updateHouseForm")
            yield call(updateHouseResource, payload);
            message.success('提交成功');
        }
    },

    reducers: {}
};
```
4. Modify the service logic

```js
import request from '@/utils/request';

export async function addHouseResource(params) {
    return request('/haoke/house/resources', {
        method: 'POST',
        body: params
    });
}

export async function updateHouseResource(params) {
    console.log(params)
    return request('/haoke/house/resources', {
        method: 'PUT',
        body: params
    });
}
```
5. Destroy the modal on close

The modal must be destroyed when it is closed (the `destroyOnClose` prop), otherwise stale form state interferes with the next update operation.
Database connection problem: `HikariPool-1 - Connection is not available, request timed out after ...`. This is a JDBC pool timeout: Hikari waits at most 30s (the default) for a free connection. It is related to concurrent, multithreaded access exhausting the pool; the exact root cause was not identified.
```yaml
spring:
  datasource:
    hikari:
      maximum-pool-size: 60
      data-source-properties:
        setIdleTimeout: 60000
        setConnectionTimeout: 60000
        setValidationTimeout: 3000
        setLoginTimeout: 5
        setMaxLifetime: 60000
```
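The timeout mechanics can be illustrated without a database. A connection pool behaves like a semaphore with `maximum-pool-size` permits: a caller that cannot obtain a permit within the wait timeout fails, just like Hikari's "Connection is not available". This is a hypothetical sketch (pool size and timeouts are invented for illustration, not taken from the project config):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// A pool of 2 "connections"; a third concurrent borrower times out,
// mirroring Hikari's connection-wait timeout behavior.
public class PoolWaitDemo {
    static final Semaphore pool = new Semaphore(2);

    // try to "borrow a connection" within timeoutMs
    public static boolean borrow(long timeoutMs) throws InterruptedException {
        return pool.tryAcquire(timeoutMs, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws Exception {
        boolean c1 = borrow(100); // succeeds
        boolean c2 = borrow(100); // succeeds
        boolean c3 = borrow(100); // pool exhausted: waits 100ms, then fails
        System.out.println(c1 + " " + c2 + " " + c3); // true true false
        pool.release();
        pool.release();
    }
}
```

Raising `maximum-pool-size` (as above) widens the semaphore; raising the connection timeout lengthens the wait before failure.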
Extracting a common module. Analysis:
Multiple Dubbo services need the shared classes and methods extracted into a common module. Keeping each Dubbo service independent eases later extension and maintenance. In the Dubbo provider project (haoke-manage-dubbo-server), move BasePOJO, BaseServiceImpl, and vo.PageInfo into this module and import the shared dependencies.
The other modules depend on this one and delete their own copies of those classes.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <artifactId>haoke-manage-dubbo-server-common</artifactId>

    <dependencies>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-boot-starter</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.16</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
    </dependencies>
</project>
```
Image storage: implement an image upload capability for other services to use.
Topics: analysis of image storage solutions; a cloud object-storage implementation (Tencent COS below); a local-storage implementation. An image upload service needs backing storage; the candidate solutions are:
Save images directly to the server's disk; use a distributed file system; or use a third-party storage service.

PicUploadResult (VO): the response object the service provider wraps its result in.
```java
package com.haoke.api.vo;

import lombok.Data;

@Data
public class PicUploadResult {

    // field names match what the antd Upload component (and the services below) expect
    private String uid;

    private String name;

    private String status;

    private String response;
}
```
Local file system storage

1. Write the service: PicUploadFileSystemService
```java
package com.haoke.api.service;

@Service
public class PicUploadFileSystemService {

    // allowed image extensions
    private static final String[] IMAGE_TYPE = new String[]{".bmp", ".jpg", ".jpeg", ".gif", ".png"};

    public PicUploadResult upload(MultipartFile uploadFile) {
        // check the extension against the whitelist
        boolean isLegal = false;
        for (String type : IMAGE_TYPE) {
            if (StringUtils.endsWithIgnoreCase(uploadFile.getOriginalFilename(), type)) {
                isLegal = true;
                break;
            }
        }
        PicUploadResult fileUploadResult = new PicUploadResult();
        if (!isLegal) {
            fileUploadResult.setStatus("error");
            return fileUploadResult;
        }
        String fileName = uploadFile.getOriginalFilename();
        String filePath = getFilePath(fileName);
        // expose the file through the image domain served by nginx
        String picUrl = StringUtils.replace(
                StringUtils.substringAfter(filePath, "F:\\haoke-upload"), "\\", "/");
        fileUploadResult.setName("http://image.haoke.com" + picUrl);
        File newFile = new File(filePath);
        try {
            uploadFile.transferTo(newFile);
        } catch (IOException e) {
            e.printStackTrace();
            fileUploadResult.setStatus("error");
            return fileUploadResult;
        }
        fileUploadResult.setStatus("done");
        fileUploadResult.setUid(String.valueOf(System.currentTimeMillis()));
        return fileUploadResult;
    }

    private String getFilePath(String sourceFileName) {
        String baseFolder = "F:\\haoke-upload" + File.separator + "images";
        Date nowDate = new Date();
        // store under images/yyyy/MM/dd
        String fileFolder = baseFolder + File.separator
                + new DateTime(nowDate).toString("yyyy") + File.separator
                + new DateTime(nowDate).toString("MM") + File.separator
                + new DateTime(nowDate).toString("dd");
        File file = new File(fileFolder);
        if (!file.isDirectory()) {
            file.mkdirs();
        }
        // timestamp + random suffix avoids name collisions
        String fileName = new DateTime(nowDate).toString("yyyyMMddhhmmssSSSS")
                + RandomUtils.nextInt(100, 9999) + "."
                + StringUtils.substringAfterLast(sourceFileName, ".");
        return fileFolder + File.separator + fileName;
    }
}
```
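The yyyy/MM/dd path scheme can also be written with the JDK's java.time instead of Joda-Time. This is a hypothetical stand-alone variant of the path builder (the base folder, parameters, and name generation are illustrative, not the project's exact code):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.concurrent.ThreadLocalRandom;

// Builds "<base>/yyyy/MM/dd/<millis><random>.<ext>" like getFilePath above,
// but with java.time and forward slashes for portability.
public class UploadPath {
    public static String filePath(String baseFolder, String sourceFileName, LocalDate date) {
        String folder = baseFolder + "/"
                + date.format(DateTimeFormatter.ofPattern("yyyy/MM/dd"));
        // keep the original extension, randomize the rest of the name
        String ext = sourceFileName.substring(sourceFileName.lastIndexOf('.') + 1);
        String name = System.currentTimeMillis()
                + "" + ThreadLocalRandom.current().nextInt(100, 9999) + "." + ext;
        return folder + "/" + name;
    }

    public static void main(String[] args) {
        System.out.println(filePath("/data/haoke-upload/images", "room.jpg",
                LocalDate.of(2021, 3, 5)));
    }
}
```

Date-partitioned folders keep any single directory small and make old uploads easy to archive by date.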
2. Modify the reference in the controller

```java
package com.haoke.api.controller;

@RequestMapping("pic/upload")
@Controller
public class PicUploadController {

    @Autowired
    private PicUploadFileSystemService picUploadService;

    @PostMapping
    @ResponseBody
    public PicUploadResult upload(@RequestParam("file") MultipartFile uploadFile) throws Exception {
        return this.picUploadService.upload(uploadFile);
    }
}
```
3. Start the service and test the interface

The generated link is a URL on image.haoke.com, so nginx must map it to the local upload directory.
4. Set up nginx to serve the images. Modify the nginx config file (nginx dir/conf/nginx.conf):
```nginx
server {
    listen       80;
    server_name  image.haoke.com;

    #charset koi8-r;
    #access_log  logs/host.access.log  main;

    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    location / {
        root E:\idea\graduateProject\code\upload;
    }
}
```
Modify the local hosts file. If you lack permission to edit C:\Windows\System32\drivers\etc\hosts, either:

open C:\Windows\System32\drivers\etc, locate hosts, and grant your account full permissions;
open the hosts file elsewhere and save it back to the original path; or
right-click SwitchHosts! and run it as administrator.
```shell
# development environment
127.0.0.1 manage.haoke.com
127.0.0.1 image.haoke.com
```
Start nginx:

```shell
cd <nginx directory>
start nginx.exe

# stop nginx
nginx -s stop
```
5. Visit the link to test.
Tencent Cloud COS. Enable the Cloud Object Storage (COS) service in the Tencent Cloud object storage console. Create a bucket in the same console. On the API key management page of the access management console, obtain the APPID and create a SecretId and SecretKey.
1. Import dependencies

```xml
<dependency>
    <groupId>com.qcloud</groupId>
    <artifactId>cos_api</artifactId>
    <version>5.6.37</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-configuration-processor</artifactId>
    <optional>true</optional>
</dependency>
```
2. Write the configuration file (never commit real credentials; placeholders shown here):

```properties
tencent.cos.appid=<your-appid>
tencent.cos.secret-id=<your-secret-id>
tencent.cos.secret-key=<your-secret-key>
tencent.cos.bucket-name=haoke-1257323542
tencent.cos.region-id=ap-beijing
tencent.cos.base-url=https://haoke-1257323542.cos.ap-beijing.myqcloud.com
```
3. CosConfig

```java
package com.haoke.api.config;

import com.qcloud.cos.COSClient;
import com.qcloud.cos.ClientConfig;
import com.qcloud.cos.auth.BasicCOSCredentials;
import com.qcloud.cos.auth.COSCredentials;
import com.qcloud.cos.region.Region;
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;

@Data
@Configuration
@PropertySource(value = {"classpath:tencent.properties"})
@ConfigurationProperties(prefix = "tencent.cos")
public class CosConfig {

    private String appId;
    private String secretId;
    private String secretKey;
    private String bucketName;
    private String regionId;
    private String baseUrl;

    @Bean
    public COSClient cosClient() {
        COSCredentials cred = new BasicCOSCredentials(this.secretId, this.secretKey);
        ClientConfig clientConfig = new ClientConfig(new Region(this.regionId));
        return new COSClient(cred, clientConfig);
    }
}
```
4. Write the service: PicUploadTencentService
```java
package com.haoke.api.service;

@Service
public class PicUploadTencentService {

    private static final String[] IMAGE_TYPE = new String[]{".bmp", ".jpg", ".jpeg", ".gif", ".png"};

    @Autowired
    private COSClient cosClient;

    @Autowired
    private CosConfig cosConfig;

    public PicUploadResult upload(MultipartFile uploadFile) {
        // check the extension against the whitelist
        boolean isLegal = false;
        for (String type : IMAGE_TYPE) {
            if (StringUtils.endsWithIgnoreCase(uploadFile.getOriginalFilename(), type)) {
                isLegal = true;
                break;
            }
        }
        PicUploadResult fileUploadResult = new PicUploadResult();
        if (!isLegal) {
            fileUploadResult.setStatus("error");
            return fileUploadResult;
        }
        String fileName = uploadFile.getOriginalFilename();
        String filePath = getFilePath(fileName);
        File localFile = null;
        try {
            // buffer the upload in a temp file before pushing it to COS
            // (the temp-file name must not contain path separators)
            localFile = File.createTempFile("cos-upload",
                    "." + StringUtils.substringAfterLast(fileName, "."));
            uploadFile.transferTo(localFile);
            localFile.deleteOnExit();
            cosClient.putObject(cosConfig.getBucketName(), filePath, localFile);
        } catch (Exception e) {
            e.printStackTrace();
            fileUploadResult.setStatus("error");
            return fileUploadResult;
        }
        // do not shut down the shared COSClient bean here; it is reused across requests
        fileUploadResult.setStatus("done");
        fileUploadResult.setName(this.cosConfig.getBaseUrl() + filePath);
        fileUploadResult.setUid(String.valueOf(System.currentTimeMillis()));
        return fileUploadResult;
    }

    private String getFilePath(String fileName) {
        DateTime dateTime = new DateTime();
        // object key: images/yyyy/MM/dd/<millis><random>.<ext>
        return "images/" + dateTime.toString("yyyy") + "/" + dateTime.toString("MM") + "/"
                + dateTime.toString("dd") + "/" + System.currentTimeMillis()
                + RandomUtils.nextInt(100, 9999) + "."
                + StringUtils.substringAfterLast(fileName, ".");
    }
}
```
5. Controller

```java
package com.haoke.api.controller;

@RequestMapping("pic/upload")
@Controller
public class PicUploadController {

    @Autowired
    private PicUploadTencentService picUploadTencentService;

    @PostMapping
    @ResponseBody
    public PicUploadResult upload(@RequestParam("file") MultipartFile uploadFile) throws Exception {
        return this.picUploadTencentService.upload(uploadFile);
    }
}
```
Integrating image upload with the front end

1. Modify PicturesWall.js

```jsx
<Upload
    action="/haoke/pic/upload"
    listType="picture-card"
    fileList={fileList}
    onPreview={this.handlePreview}
    onChange={this.handleChange}
>
```
2. Modify AddResources.js

```js
handleFileList = (obj) => {
    const pics = new Set();
    obj.forEach((v, k) => {
        if (v.response) {
            pics.add(v.response.name);
        }
    });
    this.setState({ pics })
}
```
Modify the image data in the form-submit logic:

```js
values.pic = [...this.state.pics].join(',');
```
Test.
Adding a cache layer to the API. Topics: cache analysis; building a Redis cluster with Docker; adding cache logic to the interface; having the interceptor pass specific requests through.

Cache analysis: if every API call queries the database, the database comes under heavy concurrent load, so the interface needs a cache.
The project uses Redis in cluster mode, and the API accesses it through Spring Data Redis.
The cache logic therefore belongs in the API service.
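The resulting flow is the classic cache-aside pattern: check Redis first, and only on a miss call the Dubbo service and backfill the cache. A toy sketch, with a HashMap standing in for Redis and a function standing in for the database query (all names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal cache-aside: a hit skips the "database"; a miss queries it
// once and stores the result for subsequent calls.
public class CacheAside {
    private final Map<String, String> cache = new HashMap<>();
    int dbHits = 0; // counts how often the "database" was queried

    public String get(String key, Function<String, String> loader) {
        String cached = cache.get(key);
        if (cached != null) {
            return cached;                 // cache hit: no DB round-trip
        }
        dbHits++;
        String value = loader.apply(key);  // miss: query the DB ...
        cache.put(key, value);             // ... and populate the cache
        return value;
    }

    public static void main(String[] args) {
        CacheAside c = new CacheAside();
        c.get("house:1", k -> "3室2厅");
        c.get("house:1", k -> "3室2厅");
        System.out.println(c.dbHits); // 1 — the second call is served from cache
    }
}
```

In the real service the Map is Redis (shared across API instances) and entries carry an expiry so updated house data eventually becomes visible.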
Building a Redis cluster with Docker
```shell
# pull the image
docker pull redis

# create the containers
docker create --name redis-node01 -v /data/redis-data/node01:/data -p 6379:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-01.conf
docker create --name redis-node02 -v /data/redis-data/node02:/data -p 6380:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-02.conf
docker create --name redis-node03 -v /data/redis-data/node03:/data -p 6381:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-03.conf

# start the containers
docker start redis-node01 redis-node02 redis-node03

# build the cluster: operate inside redis-node01
docker exec -it redis-node01 /bin/bash

# create the cluster
redis-cli --cluster create 172.17.0.1:6379 172.17.0.1:6380 172.17.0.1:6381 --cluster-replicas 0
```
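As background on how these three nodes share data: Redis Cluster assigns every key to one of 16384 hash slots using a CRC16 (XMODEM variant) of the key, honoring `{hash tag}` substrings. This sketch mirrors the documented slot algorithm (it is background material, not code from the project):

```java
import java.nio.charset.StandardCharsets;

// Computes the Redis Cluster hash slot for a key:
// slot = CRC16-XMODEM(key) mod 16384, hashing only the {tag} if present.
public class HashSlot {
    static int crc16(byte[] data) {
        int crc = 0; // XMODEM: init 0, polynomial 0x1021
        for (byte b : data) {
            crc ^= (b & 0xFF) << 8;
            for (int i = 0; i < 8; i++) {
                crc = ((crc & 0x8000) != 0) ? ((crc << 1) ^ 0x1021) : (crc << 1);
                crc &= 0xFFFF;
            }
        }
        return crc;
    }

    public static int slot(String key) {
        int open = key.indexOf('{');
        if (open >= 0) {
            int close = key.indexOf('}', open + 1);
            if (close > open + 1) {          // non-empty {tag}: hash only the tag
                key = key.substring(open + 1, close);
            }
        }
        return crc16(key.getBytes(StandardCharsets.UTF_8)) % 16384;
    }

    public static void main(String[] args) {
        System.out.println(slot("key_1"));
        // keys sharing a {tag} always land in the same slot (and thus on one node)
        System.out.println(slot("{user1}:name") == slot("{user1}:age")); // true
    }
}
```

This is why `redis-cli -c` (used later) follows `MOVED` redirects: the client may ask a node that does not own the key's slot.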
Building the cluster failed: the redis nodes could not be reached, even though each node could be connected to individually. The fix is to use the docker containers' own IP addresses; 172.17.0.1 is the address the host assigns to the docker bridge, not a container address.
```shell
# stop the containers
docker stop redis-node01 redis-node02 redis-node03
# remove the containers
docker rm redis-node01 redis-node02 redis-node03
# remove the redis data directory
rm -rf /data/redis-data

# recreate the containers
docker create --name redis-node01 -v /data/redis-data/node01:/data -p 6379:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-01.conf
docker create --name redis-node02 -v /data/redis-data/node02:/data -p 6380:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-02.conf
docker create --name redis-node03 -v /data/redis-data/node03:/data -p 6381:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-03.conf

# start the containers
docker start redis-node01 redis-node02 redis-node03

# look up each container's ip address
docker inspect redis-node01   # -> 172.17.0.3
docker inspect redis-node02   # -> 172.17.0.5
docker inspect redis-node03   # -> 172.17.0.6

# operate inside redis-node01
docker exec -it redis-node01 /bin/bash

# create the cluster (note: inside the bridge network every node listens on 6379)
redis-cli --cluster create 172.17.0.3:6379 172.17.0.5:6379 172.17.0.6:6379 --cluster-replicas 0
```
The cluster was built successfully:
View the cluster info:
```shell
root@03adb7fdf0e0:/data# redis-cli
127.0.0.1:6379> CLUSTER NODES
```
Remaining problem: the node IP addresses were assigned by docker, so clients outside the host cannot reach them.
Solution: rebuild the cluster on the host network (see docker network types).
```shell
# stop the containers
docker stop redis-node01 redis-node02 redis-node03
# remove the containers
docker rm redis-node01 redis-node02 redis-node03
# remove the redis data directory
rm -rf /data/redis-data

# create the containers on the host network
docker create --name redis-node01 --net host -v /data/redis-data/node01:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16379 --cluster-config-file nodes-node-01.conf --port 6379
docker create --name redis-node02 --net host -v /data/redis-data/node02:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16380 --cluster-config-file nodes-node-02.conf --port 6380
docker create --name redis-node03 --net host -v /data/redis-data/node03:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16381 --cluster-config-file nodes-node-03.conf --port 6381

# start the containers
docker start redis-node01 redis-node02 redis-node03

# operate inside redis-node01
docker exec -it redis-node01 /bin/bash

# 8.140.130.91 is the host's public ip address
redis-cli --cluster create 8.140.130.91:6379 8.140.130.91:6380 8.140.130.91:6381 --cluster-replicas 0
```
--name: container name
-v /data/redis-data/node01:/data: map the container's data directory to a host path
-p 6380:6379: port mapping (not needed with --net host)
--cluster-enabled yes: enable cluster mode
--cluster-config-file nodes-node-01.conf: this node's cluster config file
--cluster-announce-ip 8.140.130.91: the public IP this node announces to the cluster
--cluster-announce-bus-port 16379: this node's cluster bus port
If the announce IP and bus port are not set, clients fail with:
JedisClusterMaxAttemptsException: No more cluster attempts left.
View the cluster info and test the cluster.
Testing the cluster

1. Import dependencies

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
</dependency>
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.6</version>
</dependency>
```
2. Redis configuration file

```properties
spring.redis.jedis.pool.max-wait=5000
spring.redis.jedis.pool.max-idle=100
spring.redis.jedis.pool.min-idle=10
spring.redis.timeout=10
spring.redis.cluster.nodes=8.140.130.91:6379,8.140.130.91:6380,8.140.130.91:6381
spring.redis.cluster.max-redirects=5
```
3. Bean injection

```java
package com.haoke.api.config;

import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.PropertySource;
import org.springframework.stereotype.Component;

import java.util.List;

@PropertySource(value = "classpath:application.properties")
@ConfigurationProperties(prefix = "spring.redis.cluster")
@Component
@Data
public class ClusterConfigurationProperties {

    private List<String> nodes;

    private Integer maxRedirects;
}
```
4. Register the Redis connection factory (reference: https://blog.csdn.net/qq_40091033/article/details/106682199)
```java
package com.haoke.api.config;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisClusterConfiguration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.StringRedisSerializer;

@Configuration
public class RedisClusterConfig {

    @Autowired
    private ClusterConfigurationProperties clusterProperties;

    @Bean
    public RedisConnectionFactory connectionFactory() {
        RedisClusterConfiguration redisClusterConfiguration =
                new RedisClusterConfiguration(clusterProperties.getNodes());
        redisClusterConfiguration.setMaxRedirects(clusterProperties.getMaxRedirects());
        return new JedisConnectionFactory(redisClusterConfiguration);
    }

    @Bean
    public RedisTemplate<String, String> redisTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<String, String> redisTemplate = new RedisTemplate<>();
        redisTemplate.setConnectionFactory(redisConnectionFactory);
        redisTemplate.setKeySerializer(new StringRedisSerializer());
        redisTemplate.setValueSerializer(new StringRedisSerializer());
        redisTemplate.afterPropertiesSet();
        return redisTemplate;
    }
}
```
Test class

```java
package com.haoke.api;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.test.context.junit4.SpringRunner;

import java.util.Set;

@RunWith(SpringRunner.class)
@SpringBootTest
public class TestRedis {

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    @Test
    public void testSave() {
        // write 100 test entries into the cluster
        for (int i = 0; i < 100; i++) {
            this.redisTemplate.opsForValue().set("key_" + i, "value_" + i);
        }
        // read them back by pattern and print the values
        Set<String> keys = this.redisTemplate.keys("key_*");
        for (String key : keys) {
            String value = this.redisTemplate.opsForValue().get(key);
            System.out.println(value);
        }
    }
}
```
```shell
# enter the redis-node01 container
docker exec -it redis-node01 /bin/bash
redis-cli -c -p 6379
```
Adding the cache logic. There are two ways to implement caching:
1. Each interface controls its own cache logic
2. Cache logic is controlled in one central place

Checking for a Redis cache hit: for a POST request the parameters have to be read from the request's input stream. When Redis misses and the request is passed through, the controller can no longer read those parameters, because a servlet input stream can only be read once and it has already been consumed, so the query finds no data. The request therefore has to be wrapped so the body can be read multiple times.
Generating the Redis cache: read the result the controller returns and, following the AOP idea, post-process the response before it is written out.
Checking whether Redis has a hit: an interceptor performs the cache-hit check.

```java
package com.haoke.api.interceptor;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Map;

@Component
public class RedisInterceptor implements HandlerInterceptor {

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    private static ObjectMapper mapper = new ObjectMapper();

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        // only GET requests and POSTs to /graphql are cached; everything else passes through
        if (!StringUtils.equalsIgnoreCase(request.getMethod(), "GET")) {
            if (!StringUtils.equalsIgnoreCase(request.getRequestURI(), "/graphql")) {
                return true;
            }
        }
        String data = this.redisTemplate.opsForValue().get(createRedisKey(request));
        if (StringUtils.isEmpty(data)) {
            // cache miss: let the request through to the controller
            return true;
        }
        // cache hit: write the cached response and stop the handler chain
        response.setCharacterEncoding("UTF-8");
        response.setContentType("application/json; charset=utf-8");
        response.getWriter().write(data);
        return false;
    }

    public static String createRedisKey(HttpServletRequest request) throws IOException {
        String paramStr = request.getRequestURI();
        Map<String, String[]> parameterMap = request.getParameterMap();
        if (parameterMap.isEmpty()) {
            // POST: the parameters live in the request body. Reading the input stream here
            // consumes it; it can only be read once, so by the time the request reaches the
            // controller the body is gone. This is fixed below by wrapping the request.
            paramStr += IOUtils.toString(request.getInputStream(), "UTF-8");
        } else {
            paramStr += mapper.writeValueAsString(parameterMap);
        }
        return "WEB_DATA_" + DigestUtils.md5Hex(paramStr);
    }
}
```
Register the interceptor with the web container:

```java
@Configuration
public class WebConfig implements WebMvcConfigurer {

    @Autowired
    private RedisInterceptor redisInterceptor;

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        registry.addInterceptor(redisInterceptor).addPathPatterns("/**");
    }
}
```
Testing the interceptor
```
paramStr = "/graphql{"query":"query HouseResourcesList($pageSize: Int, $page: Int) {\n HouseResourcesList(pageSize: $pageSize, page: $page) {\n list {\n id\n title\n pic\n title\n coveredArea\n orientation\n floor\n rent\n }\n }\n}","variables":{"pageSize":2,"page":1},"operationName":"HouseResourcesList"}"
```
The MD5 of this parameter string is:
```
redisKey = WEB_DATA_822d7e70c286f68877cb6759b04498d4
```
Since Redis does not yet contain this redisKey, the lookup returns data = null.
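The key derivation can be checked independently of the web stack. A minimal sketch using only the JDK's MessageDigest; the project itself uses commons-codec's DigestUtils.md5Hex, which produces the same hex digest, and the paramStr in main is a shortened placeholder rather than the real GraphQL payload:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class RedisKeyDemo {

    // hex-encoded MD5, equivalent to commons-codec's DigestUtils.md5Hex
    static String md5Hex(String input) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // URI plus serialized parameters, mirroring createRedisKey (payload abbreviated)
        String paramStr = "/graphql" + "{\"pageSize\":2,\"page\":1}";
        System.out.println("WEB_DATA_" + md5Hex(paramStr));
    }
}
```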
However, because the interceptor has already read the input stream, and a request's input stream can only be read once, by the time the request reaches the controller the stream is empty and the parameters can no longer be obtained.
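The read-once problem and the buffering fix can be illustrated with plain streams, outside the servlet API. The ReplayableBody class below is a hypothetical stand-in for the wrapper that follows: it drains the source once into a byte array, then hands out a fresh stream on every call:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ReplayableBody {

    private final byte[] body;

    // drain the source exactly once, like the wrapper's constructor does
    public ReplayableBody(InputStream source) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = source.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        this.body = out.toByteArray();
    }

    // every caller gets a fresh stream over the same buffered bytes
    public InputStream getInputStream() {
        return new ByteArrayInputStream(body);
    }

    public static void main(String[] args) throws IOException {
        InputStream once = new ByteArrayInputStream("{\"page\":1}".getBytes(StandardCharsets.UTF_8));
        ReplayableBody wrapped = new ReplayableBody(once);

        // the raw stream is now exhausted: read() returns -1
        System.out.println(once.read()); // -1

        // but the wrapper can serve the body as many times as needed
        String first = new String(wrapped.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
        String second = new String(wrapped.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
        System.out.println(first.equals(second)); // true
    }
}
```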
Solving it by wrapping the request: wrap the HttpServletRequest.
```java
package com.haoke.api.interceptor;

import org.apache.commons.io.IOUtils;

import javax.servlet.ReadListener;
import javax.servlet.ServletInputStream;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class MyServletRequestWrapper extends HttpServletRequestWrapper {

    // the request body, buffered once so it can be re-read any number of times
    private final byte[] body;

    public MyServletRequestWrapper(HttpServletRequest request) throws IOException {
        super(request);
        body = IOUtils.toByteArray(super.getInputStream());
    }

    @Override
    public BufferedReader getReader() throws IOException {
        return new BufferedReader(new InputStreamReader(getInputStream()));
    }

    @Override
    public ServletInputStream getInputStream() throws IOException {
        // every call hands out a fresh stream over the buffered body
        return new RequestBodyCachingInputStream(body);
    }

    private class RequestBodyCachingInputStream extends ServletInputStream {

        private byte[] body;
        private int lastIndexRetrieved = -1;
        private ReadListener listener;

        public RequestBodyCachingInputStream(byte[] body) {
            this.body = body;
        }

        @Override
        public int read() throws IOException {
            if (isFinished()) {
                return -1;
            }
            // mask to 0..255 so bytes >= 0x80 are not returned as negative values
            int i = body[lastIndexRetrieved + 1] & 0xFF;
            lastIndexRetrieved++;
            if (isFinished() && listener != null) {
                try {
                    listener.onAllDataRead();
                } catch (IOException e) {
                    listener.onError(e);
                    throw e;
                }
            }
            return i;
        }

        @Override
        public boolean isFinished() {
            return lastIndexRetrieved == body.length - 1;
        }

        @Override
        public boolean isReady() {
            // the body is fully buffered in memory, so a read can never block
            return true;
        }

        @Override
        public void setReadListener(ReadListener readListener) {
            if (readListener == null) {
                throw new IllegalArgumentException("listener cannot be null");
            }
            if (this.listener != null) {
                throw new IllegalStateException("listener has already been set");
            }
            this.listener = readListener;
            try {
                // everything is already in memory, so all data counts as read
                listener.onAllDataRead();
            } catch (IOException e) {
                listener.onError(e);
            }
        }

        @Override
        public int available() throws IOException {
            return body.length - lastIndexRetrieved - 1;
        }

        @Override
        public void close() throws IOException {
            lastIndexRetrieved = body.length - 1;
            body = null;
        }
    }
}
```
Use a filter to replace the request with the wrapper:
```java
package com.haoke.api.interceptor;

import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

@Component
public class RequestReplaceFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
        // swap in the body-caching wrapper before anything downstream reads the stream
        if (!(request instanceof MyServletRequestWrapper)) {
            request = new MyServletRequestWrapper(request);
        }
        filterChain.doFilter(request, response);
    }
}
```
As expected, the request has been replaced by the custom wrapper.
Stepping into GraphQLController, the request now contains the data.
Generating the cache: the cache cannot be generated in the interceptor, because the interceptor never sees the data the controller returns.
Instead, use Spring AOP to intercept the response before the result is written out. Since the interception logic is ours to implement, we can grab the result data there and write it into the cache.
```java
package com.haoke.api.interceptor;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.haoke.api.controller.GraphQLController;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.MethodParameter;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.http.MediaType;
import org.springframework.http.server.ServerHttpRequest;
import org.springframework.http.server.ServerHttpResponse;
import org.springframework.http.server.ServletServerHttpRequest;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.servlet.mvc.method.annotation.ResponseBodyAdvice;

import java.time.Duration;

@ControllerAdvice
public class MyResponseBodyAdvice implements ResponseBodyAdvice {

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    private ObjectMapper mapper = new ObjectMapper();

    @Override
    public boolean supports(MethodParameter returnType, Class converterType) {
        // advise all GET handlers, plus the POST handler declared in GraphQLController
        if (returnType.hasMethodAnnotation(GetMapping.class)) {
            return true;
        }
        if (returnType.hasMethodAnnotation(PostMapping.class)
                && StringUtils.equals(GraphQLController.class.getName(),
                        returnType.getExecutable().getDeclaringClass().getName())) {
            return true;
        }
        return false;
    }

    @Override
    public Object beforeBodyWrite(Object body, MethodParameter returnType, MediaType selectedContentType,
                                  Class selectedConverterType, ServerHttpRequest request, ServerHttpResponse response) {
        try {
            // same key derivation as the interceptor, so hits and writes line up
            String redisKey = RedisInterceptor.createRedisKey(((ServletServerHttpRequest) request).getServletRequest());
            String redisValue;
            if (body instanceof String) {
                redisValue = (String) body;
            } else {
                redisValue = mapper.writeValueAsString(body);
            }
            // cache the serialized response for one hour
            this.redisTemplate.opsForValue().set(redisKey, redisValue, Duration.ofHours(1));
        } catch (Exception e) {
            e.printStackTrace();
        }
        return body;
    }
}
```
Testing: the second query for the same data hits the cache.
Checking the server-side cluster.
Integrating the front end. Handling CORS on a cache hit: the mocked data requests are all OPTIONS requests, which must not be intercepted.
The course covers this case, but my mock requests are all GETs, so it never occurs for me; it is recorded here for reference only.
```java
package com.haoke.api.interceptor;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Map;

@Component
public class RedisInterceptor implements HandlerInterceptor {

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    private static ObjectMapper mapper = new ObjectMapper();

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        // CORS preflight requests carry no body and must pass through untouched
        if (StringUtils.equalsIgnoreCase(request.getMethod(), "OPTIONS")) {
            return true;
        }
        // mock endpoints are never cached
        if (request.getRequestURI().startsWith("/mock")) {
            return true;
        }
        // only GET requests and POSTs to /graphql are cached
        if (!StringUtils.equalsIgnoreCase(request.getMethod(), "GET")) {
            if (!StringUtils.equalsIgnoreCase(request.getRequestURI(), "/graphql")) {
                return true;
            }
        }
        String data = this.redisTemplate.opsForValue().get(createRedisKey(request));
        if (StringUtils.isEmpty(data)) {
            return true;
        }
        // cache hit: the response bypasses the MVC chain, so the CORS headers
        // have to be set by hand here before the cached body is written
        response.setCharacterEncoding("UTF-8");
        response.setContentType("application/json; charset=utf-8");
        response.setHeader("Access-Control-Allow-Origin", "*");
        response.setHeader("Access-Control-Allow-Methods", "GET,POST,PUT,DELETE,OPTIONS");
        response.setHeader("Access-Control-Allow-Credentials", "true");
        response.setHeader("Access-Control-Allow-Headers", "Content-Type,X-Token");
        response.getWriter().write(data);
        return false;
    }

    public static String createRedisKey(HttpServletRequest request) throws IOException {
        String paramStr = request.getRequestURI();
        Map<String, String[]> parameterMap = request.getParameterMap();
        if (parameterMap.isEmpty()) {
            paramStr += IOUtils.toString(request.getInputStream(), "UTF-8");
        } else {
            paramStr += mapper.writeValueAsString(parameterMap);
        }
        return "WEB_DATA_" + DigestUtils.md5Hex(paramStr);
    }
}
```