*.class
# Package Files #
*.jar
*.war
*.ear
*.versionsBackup
*.iml
.idea/
target/
logs/
log/
# Leaf
> There are no two identical leaves in the world.
>
> — Leibniz
[中文文档](./README_CN.md) | [English Document](./README.md)
## Introduction
Leaf draws on the common ID generation schemes in the industry, including DB auto-increment, redis, UUID and snowflake.
Each of these approaches has its own problems, so we decided to implement a distributed ID generation service to meet our requirements.
At present, Leaf serves many business lines inside Meituan-Dianping, including finance, catering, food delivery, hotel and travel, and Maoyan Movie. On a 4C8G VM, called through the company's RPC framework, load tests show nearly 50k QPS with a TP999 of 1 ms.
You can use it to build a distributed unique ID distribution center in a service-oriented (SOA) architecture, acting as the ID provider for all applications.
## Quick Start
### Leaf Server
Leaf provides an HTTP service based on spring boot to get IDs.
#### run Leaf Server
##### build
```shell
git clone git@github.com:Meituan-Dianping/Leaf.git
cd leaf
mvn clean install -DskipTests
cd leaf-server
```
##### run
###### maven
```shell
mvn spring-boot:run
```
or
###### shell command
```shell
sh deploy/run.sh
```
##### test
```shell
#segment
curl http://localhost:8080/api/segment/get/leaf-segment-test
#snowflake
curl http://localhost:8080/api/snowflake/get/test
```
#### Configuration
Leaf provides two ways to generate IDs (segment mode and snowflake mode). You can enable both at the same time, or enable just one (both are off by default).
The Leaf Server configuration lives in leaf-server/src/main/resources/leaf.properties.
| configuration | meaning | default |
| ------------------------- | ----------------------------- | ------ |
| leaf.name | leaf service name | |
| leaf.segment.enable | whether segment mode is enabled | false |
| leaf.jdbc.url | mysql url | |
| leaf.jdbc.username | mysql username | |
| leaf.jdbc.password | mysql password | |
| leaf.snowflake.enable | whether snowflake mode is enabled | false |
| leaf.snowflake.zk.address | zk address under snowflake mode | |
| leaf.snowflake.port | service registration port under snowflake mode | |
### Segment mode
To use segment mode, you need to create the DB table first and configure leaf.jdbc.url, leaf.jdbc.username and leaf.jdbc.password.
If you do not want to use it, just set leaf.segment.enable=false to disable it.
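For reference, a minimal leaf.properties fragment for segment mode might look like the following (the JDBC URL, username and password are placeholders, not values shipped with Leaf; substitute your own database settings):

```properties
leaf.segment.enable=true
leaf.jdbc.url=jdbc:mysql://127.0.0.1:3306/leaf?useUnicode=true&characterEncoding=utf8
leaf.jdbc.username=root
leaf.jdbc.password=root
```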
```sql
CREATE DATABASE leaf;
CREATE TABLE `leaf_alloc` (
`biz_tag` varchar(128) NOT NULL DEFAULT '', -- your biz unique name
`max_id` bigint(20) NOT NULL DEFAULT '1',
`step` int(11) NOT NULL,
`description` varchar(256) DEFAULT NULL,
`update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`biz_tag`)
) ENGINE=InnoDB;
insert into leaf_alloc(biz_tag, max_id, step, description) values('leaf-segment-test', 1, 2000, 'Test leaf Segment Mode Get Id');
```
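The table drives the allocation: each time a server-side buffer runs out, Leaf atomically advances `max_id` by `step` and serves IDs from the range it just claimed. A rough, self-contained sketch of that scheme (the class and field names are illustrative, and the DB update is simulated in memory):

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative sketch of segment allocation: a simulated leaf_alloc row holds
// max_id and step; each refill runs the equivalent of
// "UPDATE leaf_alloc SET max_id = max_id + step" and serves [max_id - step, max_id).
public class SegmentSketch {
    static long dbMaxId = 1;          // simulated leaf_alloc.max_id
    static final int STEP = 2000;     // simulated leaf_alloc.step

    static long rangeEnd;             // exclusive end of the current segment
    static final AtomicLong cursor = new AtomicLong();

    // Simulates the atomic DB update plus re-read of the row.
    static void refill() {
        dbMaxId += STEP;
        cursor.set(dbMaxId - STEP);
        rangeEnd = dbMaxId;
    }

    static long nextId() {
        if (rangeEnd == 0 || cursor.get() >= rangeEnd) refill();
        return cursor.getAndIncrement();
    }

    public static void main(String[] args) {
        long first = nextId();                    // first ID from the first segment
        for (int i = 0; i < 2000; i++) nextId();  // cross one segment boundary
        System.out.println(first + " " + dbMaxId); // prints "1 4001"
    }
}
```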
### Snowflake mode
The algorithm is taken from twitter's open-source snowflake algorithm.
If you do not want to use it, just configure leaf.snowflake.enable=false to disable it.
Configure the zookeeper address
```
leaf.snowflake.zk.address=${address}
leaf.snowflake.enable=true
leaf.snowflake.port=${port}
```
Configure leaf.snowflake.zk.address in leaf.properties, and configure leaf.snowflake.port, the port the leaf service listens on.
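In snowflake mode each ID packs a millisecond offset from the epoch (effectively 41 bits), a 10-bit workerId and a 12-bit sequence, matching the shifts used in leaf-core's SnowflakeIDGenImpl. A small sketch of the bit layout (the class name is illustrative):

```java
// Illustrative sketch of the snowflake bit layout used by leaf-core:
// | timestamp delta (ms) | workerId (10 bits) | sequence (12 bits) |
public class SnowflakeLayout {
    static final long TWEPOCH = 1288834974657L; // Thu Nov 04 2010, same epoch as leaf-core
    static final long WORKER_ID_BITS = 10L;
    static final long SEQUENCE_BITS = 12L;
    static final long WORKER_ID_SHIFT = SEQUENCE_BITS;
    static final long TIMESTAMP_SHIFT = SEQUENCE_BITS + WORKER_ID_BITS;

    static long compose(long timestampMs, long workerId, long sequence) {
        return ((timestampMs - TWEPOCH) << TIMESTAMP_SHIFT)
                | (workerId << WORKER_ID_SHIFT)
                | sequence;
    }

    public static void main(String[] args) {
        long id = compose(TWEPOCH + 1, 5, 7);
        // decompose to verify the fields round-trip
        System.out.println((id >>> TIMESTAMP_SHIFT) + " "
                + ((id >>> WORKER_ID_SHIFT) & 1023) + " "
                + (id & 4095)); // prints "1 5 7"
    }
}
```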
### monitor page
segment mode: http://localhost:8080/cache
### Leaf Core
Of course, to pursue higher performance you can deploy the Leaf service behind an RPC server: just depend on the leaf-core package and wrap the ID-generation API in the RPC framework of your choice.
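The surface such a wrapper needs to cover is just leaf-core's IDGen interface (`Result get(String key)` / `boolean init()`). A minimal sketch, with the interface mirrored inline so the example is self-contained (the handler and outer class names are illustrative, not part of leaf-core):

```java
// Minimal mirror of leaf-core's IDGen contract plus a trivial provider,
// showing what an RPC handler would delegate to. Only IDGen/Result/Status
// mirror leaf-core; everything else is illustrative.
public class RpcWrapperSketch {
    enum Status { SUCCESS, EXCEPTION }

    static class Result {
        final long id;
        final Status status;
        Result(long id, Status status) { this.id = id; this.status = status; }
    }

    interface IDGen {
        Result get(String key);
        boolean init();
    }

    // Stand-in generator; a real deployment would plug in the segment or snowflake impl.
    static class ZeroIDGen implements IDGen {
        public Result get(String key) { return new Result(0, Status.SUCCESS); }
        public boolean init() { return true; }
    }

    // The "RPC handler": delegate and map failures to a sentinel value.
    static long handle(IDGen idGen, String bizTag) {
        Result r = idGen.get(bizTag);
        return r.status == Status.SUCCESS ? r.id : -1;
    }

    public static void main(String[] args) {
        IDGen idGen = new ZeroIDGen();
        idGen.init();
        System.out.println(handle(idGen, "leaf-segment-test")); // prints 0
    }
}
```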
#### Attention
Note that in snowflake mode Leaf's current IP acquisition logic takes the first network card's IP directly (be especially careful with services whose IP can change) to avoid wasting workerIds.
# Leaf
> There are no two identical leaves in the world.
>
> — Leibniz
[中文文档](./README_CN.md) | [English Document](./README.md)
## Introduction
Leaf's earliest requirement was order ID generation for the various business lines. In Meituan's early days, some businesses generated IDs directly through DB auto-increment, some through a redis cache, and some simply used UUIDs. Each of these approaches has its own problems, so we decided to implement a distributed ID generation service to meet our requirements. For the detailed design, see: [Leaf: Meituan's distributed ID generation service](https://tech.meituan.com/MT_Leaf.html)
At present, Leaf serves many business lines inside Meituan-Dianping, including finance, catering, food delivery, hotel and travel, and Maoyan Movie. On a 4C8G VM, called through the company's RPC framework, load tests show nearly 50k QPS with a TP999 of 1 ms.
## Quick Start
### Leaf Server
We provide a spring boot based HTTP service to get IDs.
#### Configuration
Leaf provides two ways to generate IDs (segment mode and snowflake mode). You can enable both at the same time, or enable just one (both are off by default).
The Leaf Server configuration lives in leaf-server/src/main/resources/leaf.properties.
| configuration | meaning | default |
| ------------------------- | ----------------------------- | ------ |
| leaf.name | leaf service name | |
| leaf.segment.enable | whether segment mode is enabled | false |
| leaf.jdbc.url | mysql url | |
| leaf.jdbc.username | mysql username | |
| leaf.jdbc.password | mysql password | |
| leaf.snowflake.enable | whether snowflake mode is enabled | false |
| leaf.snowflake.zk.address | zk address in snowflake mode | |
| leaf.snowflake.port | service registration port in snowflake mode | |
#### Segment mode
To use segment mode, you need to create the DB table first and configure leaf.jdbc.url, leaf.jdbc.username and leaf.jdbc.password.
If you do not want to use this mode, just set leaf.segment.enable=false.
##### Create the table
```sql
CREATE DATABASE leaf;
CREATE TABLE `leaf_alloc` (
`biz_tag` varchar(128) NOT NULL DEFAULT '',
`max_id` bigint(20) NOT NULL DEFAULT '1',
`step` int(11) NOT NULL,
`description` varchar(256) DEFAULT NULL,
`update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`biz_tag`)
) ENGINE=InnoDB;
insert into leaf_alloc(biz_tag, max_id, step, description) values('leaf-segment-test', 1, 2000, 'Test leaf Segment Mode Get Id');
```
##### Configure the data source
Configure the leaf.jdbc.url, leaf.jdbc.username and leaf.jdbc.password parameters in leaf.properties.
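For example, a minimal fragment might look like this (the URL and credentials below are placeholders to be replaced with your own):

```properties
leaf.segment.enable=true
leaf.jdbc.url=jdbc:mysql://127.0.0.1:3306/leaf?useUnicode=true&characterEncoding=utf8
leaf.jdbc.username=root
leaf.jdbc.password=root
```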
#### Snowflake mode
The algorithm is taken from twitter's open-source snowflake algorithm.
If you do not want to use this mode, just set leaf.snowflake.enable=false.
##### Configure the zookeeper address
Configure leaf.snowflake.zk.address in leaf.properties, and configure leaf.snowflake.port, the port the leaf service listens on.
#### Run Leaf Server
##### Build
```shell
git clone git@github.com:Meituan-Dianping/Leaf.git
# configure segment mode in the project as described above
cd leaf
mvn clean install -DskipTests
cd leaf-server
```
##### Run
*Note: configure the database table or the zk address first.*
###### maven
```shell
mvn spring-boot:run
```
###### shell command
```shell
sh deploy/run.sh
```
##### Test
```shell
#segment
curl http://localhost:8080/api/segment/get/leaf-segment-test
#snowflake
curl http://localhost:8080/api/snowflake/get/test
```
##### Monitor page
segment mode: http://localhost:8080/cache
### Leaf Core
Of course, to pursue higher performance you can deploy the Leaf service behind an RPC server: just depend on the leaf-core package and wrap the ID-generation API in the RPC framework of your choice.
### Attention
Note that in snowflake mode Leaf's current IP acquisition logic takes the first network card's IP directly (be especially careful with services whose IP can change) to avoid wasting workerIds.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-parent</artifactId>
<version>1.0.1</version>
</parent>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-core</artifactId>
<packaging>jar</packaging>
<version>1.0.1</version>
<name>leaf-core</name>
<properties>
<mysql-connector-java.version>5.1.38</mysql-connector-java.version>
<commons-io.version>2.4</commons-io.version>
<log4j.version>2.7</log4j.version>
<mybatis-spring.version>1.2.5</mybatis-spring.version>
</properties>
<dependencies>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis</artifactId>
</dependency>
<dependency>
<groupId>org.perf4j</groupId>
<artifactId>perf4j</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
</dependency>
<!--zk-->
<dependency>
<groupId>org.apache.curator</groupId>
<artifactId>curator-recipes</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson-databind.version}</version>
<scope>provided</scope>
</dependency>
<!-- test scope -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-slf4j-impl</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>druid</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis-spring</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
</project>
package com.sankuai.inf.leaf;
import com.sankuai.inf.leaf.common.Result;
public interface IDGen {
Result get(String key);
boolean init();
}
package com.sankuai.inf.leaf.common;
public class CheckVO {
private long timestamp;
private int workID;
public CheckVO(long timestamp, int workID) {
this.timestamp = timestamp;
this.workID = workID;
}
public long getTimestamp() {
return timestamp;
}
public void setTimestamp(long timestamp) {
this.timestamp = timestamp;
}
public int getWorkID() {
return workID;
}
public void setWorkID(int workID) {
this.workID = workID;
}
}
package com.sankuai.inf.leaf.common;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Properties;
public class PropertyFactory {
private static final Logger logger = LoggerFactory.getLogger(PropertyFactory.class);
private static final Properties prop = new Properties();
static {
try {
prop.load(PropertyFactory.class.getClassLoader().getResourceAsStream("leaf.properties"));
} catch (Exception e) {
logger.warn("Load leaf.properties failed", e);
}
}
public static Properties getProperties() {
return prop;
}
}
package com.sankuai.inf.leaf.common;
public class Result {
private long id;
private Status status;
public Result() {
}
public Result(long id, Status status) {
this.id = id;
this.status = status;
}
public long getId() {
return id;
}
public void setId(long id) {
this.id = id;
}
public Status getStatus() {
return status;
}
public void setStatus(Status status) {
this.status = status;
}
@Override
public String toString() {
final StringBuilder sb = new StringBuilder("Result{");
sb.append("id=").append(id);
sb.append(", status=").append(status);
sb.append('}');
return sb.toString();
}
}
package com.sankuai.inf.leaf.common;
public enum Status {
SUCCESS,
EXCEPTION
}
package com.sankuai.inf.leaf.common;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.net.Inet6Address;
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.net.SocketException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
public class Utils {
private static final Logger logger = LoggerFactory.getLogger(Utils.class);
public static String getIp() {
String ip;
try {
List<String> ipList = getHostAddress(null);
// default the first
ip = (!ipList.isEmpty()) ? ipList.get(0) : "";
} catch (Exception ex) {
ip = "";
logger.warn("Utils get IP warn", ex);
}
return ip;
}
public static String getIp(String interfaceName) {
String ip;
interfaceName = interfaceName.trim();
try {
List<String> ipList = getHostAddress(interfaceName);
ip = (!ipList.isEmpty()) ? ipList.get(0) : "";
} catch (Exception ex) {
ip = "";
logger.warn("Utils get IP warn", ex);
}
return ip;
}
/**
* Get the IP addresses of the active network interfaces.
*
* @param interfaceName interface name to filter by; null returns addresses of all interfaces
* @return List<String>
*/
private static List<String> getHostAddress(String interfaceName) throws SocketException {
List<String> ipList = new ArrayList<String>(5);
Enumeration<NetworkInterface> interfaces = NetworkInterface.getNetworkInterfaces();
while (interfaces.hasMoreElements()) {
NetworkInterface ni = interfaces.nextElement();
Enumeration<InetAddress> allAddress = ni.getInetAddresses();
while (allAddress.hasMoreElements()) {
InetAddress address = allAddress.nextElement();
if (address.isLoopbackAddress()) {
// skip the loopback addr
continue;
}
if (address instanceof Inet6Address) {
// skip the IPv6 addr
continue;
}
String hostAddress = address.getHostAddress();
if (null == interfaceName) {
ipList.add(hostAddress);
} else if (interfaceName.equals(ni.getDisplayName())) {
ipList.add(hostAddress);
}
}
}
return ipList;
}
}
package com.sankuai.inf.leaf.common;
import com.sankuai.inf.leaf.IDGen;
public class ZeroIDGen implements IDGen {
@Override
public Result get(String key) {
return new Result(0, Status.SUCCESS);
}
@Override
public boolean init() {
return true;
}
}
package com.sankuai.inf.leaf.segment.dao;
import com.sankuai.inf.leaf.segment.model.LeafAlloc;
import java.util.List;
public interface IDAllocDao {
List<LeafAlloc> getAllLeafAllocs();
LeafAlloc updateMaxIdAndGetLeafAlloc(String tag);
LeafAlloc updateMaxIdByCustomStepAndGetLeafAlloc(LeafAlloc leafAlloc);
List<String> getAllTags();
}
package com.sankuai.inf.leaf.segment.dao;
import com.sankuai.inf.leaf.segment.model.LeafAlloc;
import org.apache.ibatis.annotations.*;
import java.util.List;
public interface IDAllocMapper {
@Select("SELECT biz_tag, max_id, step, update_time FROM leaf_alloc")
@Results(value = {
@Result(column = "biz_tag", property = "key"),
@Result(column = "max_id", property = "maxId"),
@Result(column = "step", property = "step"),
@Result(column = "update_time", property = "updateTime")
})
List<LeafAlloc> getAllLeafAllocs();
@Select("SELECT biz_tag, max_id, step FROM leaf_alloc WHERE biz_tag = #{tag}")
@Results(value = {
@Result(column = "biz_tag", property = "key"),
@Result(column = "max_id", property = "maxId"),
@Result(column = "step", property = "step")
})
LeafAlloc getLeafAlloc(@Param("tag") String tag);
@Update("UPDATE leaf_alloc SET max_id = max_id + step WHERE biz_tag = #{tag}")
void updateMaxId(@Param("tag") String tag);
@Update("UPDATE leaf_alloc SET max_id = max_id + #{step} WHERE biz_tag = #{key}")
void updateMaxIdByCustomStep(@Param("leafAlloc") LeafAlloc leafAlloc);
@Select("SELECT biz_tag FROM leaf_alloc")
List<String> getAllTags();
}
package com.sankuai.inf.leaf.segment.dao.impl;
import com.sankuai.inf.leaf.segment.dao.IDAllocDao;
import com.sankuai.inf.leaf.segment.dao.IDAllocMapper;
import com.sankuai.inf.leaf.segment.model.LeafAlloc;
import org.apache.ibatis.mapping.Environment;
import org.apache.ibatis.session.Configuration;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.apache.ibatis.transaction.TransactionFactory;
import org.apache.ibatis.transaction.jdbc.JdbcTransactionFactory;
import javax.sql.DataSource;
import java.util.List;
public class IDAllocDaoImpl implements IDAllocDao {
SqlSessionFactory sqlSessionFactory;
public IDAllocDaoImpl(DataSource dataSource) {
TransactionFactory transactionFactory = new JdbcTransactionFactory();
Environment environment = new Environment("development", transactionFactory, dataSource);
Configuration configuration = new Configuration(environment);
configuration.addMapper(IDAllocMapper.class);
sqlSessionFactory = new SqlSessionFactoryBuilder().build(configuration);
}
@Override
public List<LeafAlloc> getAllLeafAllocs() {
SqlSession sqlSession = sqlSessionFactory.openSession(false);
try {
return sqlSession.selectList("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.getAllLeafAllocs");
} finally {
sqlSession.close();
}
}
@Override
public LeafAlloc updateMaxIdAndGetLeafAlloc(String tag) {
SqlSession sqlSession = sqlSessionFactory.openSession();
try {
sqlSession.update("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.updateMaxId", tag);
LeafAlloc result = sqlSession.selectOne("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.getLeafAlloc", tag);
sqlSession.commit();
return result;
} finally {
sqlSession.close();
}
}
@Override
public LeafAlloc updateMaxIdByCustomStepAndGetLeafAlloc(LeafAlloc leafAlloc) {
SqlSession sqlSession = sqlSessionFactory.openSession();
try {
sqlSession.update("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.updateMaxIdByCustomStep", leafAlloc);
LeafAlloc result = sqlSession.selectOne("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.getLeafAlloc", leafAlloc.getKey());
sqlSession.commit();
return result;
} finally {
sqlSession.close();
}
}
@Override
public List<String> getAllTags() {
SqlSession sqlSession = sqlSessionFactory.openSession(false);
try {
return sqlSession.selectList("com.sankuai.inf.leaf.segment.dao.IDAllocMapper.getAllTags");
} finally {
sqlSession.close();
}
}
}
package com.sankuai.inf.leaf.segment.model;
public class LeafAlloc {
private String key;
private long maxId;
private int step;
private String updateTime;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public long getMaxId() {
return maxId;
}
public void setMaxId(long maxId) {
this.maxId = maxId;
}
public int getStep() {
return step;
}
public void setStep(int step) {
this.step = step;
}
public String getUpdateTime() {
return updateTime;
}
public void setUpdateTime(String updateTime) {
this.updateTime = updateTime;
}
}
package com.sankuai.inf.leaf.segment.model;
import java.util.concurrent.atomic.AtomicLong;
public class Segment {
private AtomicLong value = new AtomicLong(0);
private volatile long max;
private volatile int step;
private SegmentBuffer buffer;
public Segment(SegmentBuffer buffer) {
this.buffer = buffer;
}
public AtomicLong getValue() {
return value;
}
public void setValue(AtomicLong value) {
this.value = value;
}
public long getMax() {
return max;
}
public void setMax(long max) {
this.max = max;
}
public int getStep() {
return step;
}
public void setStep(int step) {
this.step = step;
}
public SegmentBuffer getBuffer() {
return buffer;
}
public long getIdle() {
return this.getMax() - getValue().get();
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder("Segment(");
sb.append("value:");
sb.append(value);
sb.append(",max:");
sb.append(max);
sb.append(",step:");
sb.append(step);
sb.append(")");
return sb.toString();
}
}
package com.sankuai.inf.leaf.segment.model;
import java.util.Arrays;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;
/**
* Double buffer of segments
*/
public class SegmentBuffer {
private String key;
private Segment[] segments; // double buffer
private volatile int currentPos; // index of the segment currently in use
private volatile boolean nextReady; // whether the next segment is ready to switch to
private volatile boolean initOk; // whether initialization has completed
private final AtomicBoolean threadRunning; // whether the refill thread is running
private final ReadWriteLock lock;
private volatile int step;
private volatile int minStep;
private volatile long updateTimestamp;
public SegmentBuffer() {
segments = new Segment[]{new Segment(this), new Segment(this)};
currentPos = 0;
nextReady = false;
initOk = false;
threadRunning = new AtomicBoolean(false);
lock = new ReentrantReadWriteLock();
}
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public Segment[] getSegments() {
return segments;
}
public Segment getCurrent() {
return segments[currentPos];
}
public int getCurrentPos() {
return currentPos;
}
public int nextPos() {
return (currentPos + 1) % 2;
}
public void switchPos() {
currentPos = nextPos();
}
public boolean isInitOk() {
return initOk;
}
public void setInitOk(boolean initOk) {
this.initOk = initOk;
}
public boolean isNextReady() {
return nextReady;
}
public void setNextReady(boolean nextReady) {
this.nextReady = nextReady;
}
public AtomicBoolean getThreadRunning() {
return threadRunning;
}
public Lock rLock() {
return lock.readLock();
}
public Lock wLock() {
return lock.writeLock();
}
public int getStep() {
return step;
}
public void setStep(int step) {
this.step = step;
}
public int getMinStep() {
return minStep;
}
public void setMinStep(int minStep) {
this.minStep = minStep;
}
public long getUpdateTimestamp() {
return updateTimestamp;
}
public void setUpdateTimestamp(long updateTimestamp) {
this.updateTimestamp = updateTimestamp;
}
@Override
public String toString() {
final StringBuilder sb = new StringBuilder("SegmentBuffer{");
sb.append("key='").append(key).append('\'');
sb.append(", segments=").append(Arrays.toString(segments));
sb.append(", currentPos=").append(currentPos);
sb.append(", nextReady=").append(nextReady);
sb.append(", initOk=").append(initOk);
sb.append(", threadRunning=").append(threadRunning);
sb.append(", step=").append(step);
sb.append(", minStep=").append(minStep);
sb.append(", updateTimestamp=").append(updateTimestamp);
sb.append('}');
return sb.toString();
}
}
package com.sankuai.inf.leaf.snowflake;
import com.google.common.base.Preconditions;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.Result;
import com.sankuai.inf.leaf.common.Status;
import com.sankuai.inf.leaf.common.Utils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Random;
public class SnowflakeIDGenImpl implements IDGen {
@Override
public boolean init() {
return true;
}
private static final Logger LOGGER = LoggerFactory.getLogger(SnowflakeIDGenImpl.class);
private final long twepoch;
private final long workerIdBits = 10L;
private final long maxWorkerId = ~(-1L << workerIdBits); // max assignable workerId = 1023
private final long sequenceBits = 12L;
private final long workerIdShift = sequenceBits;
private final long timestampLeftShift = sequenceBits + workerIdBits;
private final long sequenceMask = ~(-1L << sequenceBits);
private long workerId;
private long sequence = 0L;
private long lastTimestamp = -1L;
private static final Random RANDOM = new Random();
public SnowflakeIDGenImpl(String zkAddress, int port) {
//Thu Nov 04 2010 09:42:54 GMT+0800 (China Standard Time)
this(zkAddress, port, 1288834974657L);
}
/**
* @param zkAddress zk address
* @param port      snowflake service listen port
* @param twepoch   start timestamp (epoch) for the ID's time component
*/
public SnowflakeIDGenImpl(String zkAddress, int port, long twepoch) {
this.twepoch = twepoch;
Preconditions.checkArgument(timeGen() > twepoch, "Snowflake not support twepoch gt currentTime");
final String ip = Utils.getIp();
SnowflakeZookeeperHolder holder = new SnowflakeZookeeperHolder(ip, String.valueOf(port), zkAddress);
LOGGER.info("twepoch:{} ,ip:{} ,zkAddress:{} port:{}", twepoch, ip, zkAddress, port);
boolean initFlag = holder.init();
if (initFlag) {
workerId = holder.getWorkerID();
LOGGER.info("START SUCCESS USE ZK WORKERID-{}", workerId);
} else {
Preconditions.checkArgument(initFlag, "Snowflake Id Gen is not init ok");
}
Preconditions.checkArgument(workerId >= 0 && workerId <= maxWorkerId, "workerID must gte 0 and lte 1023");
}
@Override
public synchronized Result get(String key) {
long timestamp = timeGen();
if (timestamp < lastTimestamp) {
long offset = lastTimestamp - timestamp;
if (offset <= 5) {
try {
wait(offset << 1);
timestamp = timeGen();
if (timestamp < lastTimestamp) {
return new Result(-1, Status.EXCEPTION);
}
} catch (InterruptedException e) {
LOGGER.error("wait interrupted");
return new Result(-2, Status.EXCEPTION);
}
} else {
return new Result(-3, Status.EXCEPTION);
}
}
if (lastTimestamp == timestamp) {
sequence = (sequence + 1) & sequenceMask;
if (sequence == 0) {
//sequence overflowed within this millisecond: move to the next millisecond and start from a random sequence
sequence = RANDOM.nextInt(100);
timestamp = tilNextMillis(lastTimestamp);
}
} else {
//new millisecond: start from a random sequence
sequence = RANDOM.nextInt(100);
}
lastTimestamp = timestamp;
long id = ((timestamp - twepoch) << timestampLeftShift) | (workerId << workerIdShift) | sequence;
return new Result(id, Status.SUCCESS);
}
protected long tilNextMillis(long lastTimestamp) {
long timestamp = timeGen();
while (timestamp <= lastTimestamp) {
timestamp = timeGen();
}
return timestamp;
}
protected long timeGen() {
return System.currentTimeMillis();
}
public long getWorkerId() {
return workerId;
}
}
package com.sankuai.inf.leaf.snowflake.exception;
public class CheckLastTimeException extends RuntimeException {
public CheckLastTimeException(String msg){
super(msg);
}
}
package com.sankuai.inf.leaf.snowflake.exception;
public class CheckOtherNodeException extends RuntimeException {
public CheckOtherNodeException(String message) {
super(message);
}
}
package com.sankuai.inf.leaf.snowflake.exception;
public class ClockGoBackException extends RuntimeException {
public ClockGoBackException(String message) {
super(message);
}
}
package com.sankuai.inf.leaf.segment;
import com.alibaba.druid.pool.DruidDataSource;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.PropertyFactory;
import com.sankuai.inf.leaf.common.Result;
import com.sankuai.inf.leaf.segment.dao.IDAllocDao;
import com.sankuai.inf.leaf.segment.dao.impl.IDAllocDaoImpl;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.io.IOException;
import java.sql.SQLException;
import java.util.Properties;
public class IDGenServiceTest {
IDGen idGen;
DruidDataSource dataSource;
@Before
public void before() throws IOException, SQLException {
// Load Db Config
Properties properties = PropertyFactory.getProperties();
// Config dataSource
dataSource = new DruidDataSource();
dataSource.setUrl(properties.getProperty("jdbc.url"));
dataSource.setUsername(properties.getProperty("jdbc.username"));
dataSource.setPassword(properties.getProperty("jdbc.password"));
dataSource.init();
// Config Dao
IDAllocDao dao = new IDAllocDaoImpl(dataSource);
// Config ID Gen
idGen = new SegmentIDGenImpl();
((SegmentIDGenImpl) idGen).setDao(dao);
idGen.init();
}
@Test
public void testGetId() {
for (int i = 0; i < 100; ++i) {
Result r = idGen.get("leaf-segment-test");
System.out.println(r);
}
}
@After
public void after() {
dataSource.close();
}
}
package com.sankuai.inf.leaf.segment;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.Result;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"classpath:applicationContext.xml"}) // load the Spring config file
public class SpringIDGenServiceTest {
@Autowired
IDGen idGen;
@Test
public void testGetId() {
for (int i = 0; i < 100; i++) {
Result r = idGen.get("leaf-segment-test");
System.out.println(r);
}
}
}
package com.sankuai.inf.leaf.snowflake;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.PropertyFactory;
import com.sankuai.inf.leaf.common.Result;
import org.junit.Test;
import java.util.Properties;
public class SnowflakeIDGenImplTest {
@Test
public void testGetId() {
Properties properties = PropertyFactory.getProperties();
properties.setProperty("leaf.zk.list", "localhost:2181");
properties.setProperty("leaf.name", "leaf_test");
IDGen idGen = new SnowflakeIDGenImpl(properties.getProperty("leaf.zk.list"), 8080);
for (int i = 1; i < 1000; ++i) {
Result r = idGen.get("a");
System.out.println(r);
}
}
}
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:tx="http://www.springframework.org/schema/tx"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">
<context:property-placeholder location="classpath:leaf.properties"/>
<bean id="dataSource" class="com.alibaba.druid.pool.DruidDataSource"
init-method="init" destroy-method="close">
<property name="url" value="${jdbc.url}"/>
<property name="username" value="${jdbc.username}"/>
<property name="password" value="${jdbc.password}"/>
<!-- max wait time to obtain a connection -->
<property name="maxWait" value="10000"/>
<property name="timeBetweenEvictionRunsMillis" value="60000"/>
<property name="minEvictableIdleTimeMillis" value="300000"/>
<property name="testWhileIdle" value="true"/>
<property name="testOnBorrow" value="true"/>
<property name="testOnReturn" value="false"/>
<property name="validationQuery" value="select 1"/>
</bean>
<bean id="idAllocDao" class="com.sankuai.inf.leaf.segment.dao.impl.IDAllocDaoImpl">
<constructor-arg ref="dataSource"/>
</bean>
<bean name="leaf" class="com.sankuai.inf.leaf.segment.SegmentIDGenImpl" init-method="init">
<property name="dao" ref="idAllocDao"/>
</bean>
</beans>
leaf.name=default
leaf.zk.list=localhost:2181
#jdbc.url=
#jdbc.username=
#jdbc.password=
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="info">
<appenders>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
</Console>
</appenders>
<loggers>
<root level="info">
<appender-ref ref="Console"/>
</root>
</loggers>
</configuration>
#!/usr/bin/env bash
# root directory of the build output
PROJ_DIR=./target
# target jar
PROJ_TARGET_JAR=${PROJ_DIR}/leaf.jar
JAVA_CMD=java
JVM_ARGS="-jar"
CMD="${JAVA_CMD} ${JVM_ARGS} ${PROJ_TARGET_JAR}"
echo "Leaf Start--------------"
echo "JVM ARGS ${JVM_ARGS}"
echo "->"
echo "PROJ_TARGET_JAR ${PROJ_TARGET_JAR}"
echo "------------------------"
$CMD
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-parent</artifactId>
<version>1.0.1</version>
</parent>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-server</artifactId>
<version>1.0.1</version>
<packaging>jar</packaging>
<name>leaf-server</name>
<description>Leaf HTTP Server</description>
<properties>
<spring-boot-dependencies.version>2.0.6.RELEASE</spring-boot-dependencies.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${spring-boot-dependencies.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- Spring Boot Freemarker dependency -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-freemarker</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-core</artifactId>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>druid</artifactId>
</dependency>
<!--zk-->
<dependency>
<groupId>org.apache.curator</groupId>
<artifactId>curator-recipes</artifactId>
<exclusions>
<exclusion>
<artifactId>log4j</artifactId>
<groupId>log4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>${spring-boot-dependencies.version}</version>
<configuration>
<mainClass>com.sankuai.inf.leaf.server.LeafServerApplication</mainClass>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>compile</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
<excludeTransitive>false</excludeTransitive>
<!-- whether to strip version info from the copied jar file names -->
<stripVersion>false</stripVersion>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
package com.sankuai.inf.leaf.server;
public class Constants {
public static final String LEAF_SEGMENT_ENABLE = "leaf.segment.enable";
public static final String LEAF_JDBC_URL = "leaf.jdbc.url";
public static final String LEAF_JDBC_USERNAME = "leaf.jdbc.username";
public static final String LEAF_JDBC_PASSWORD = "leaf.jdbc.password";
public static final String LEAF_SNOWFLAKE_ENABLE = "leaf.snowflake.enable";
public static final String LEAF_SNOWFLAKE_PORT = "leaf.snowflake.port";
public static final String LEAF_SNOWFLAKE_ZK_ADDRESS = "leaf.snowflake.zk.address";
}
package com.sankuai.inf.leaf.server;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class LeafServerApplication {
public static void main(String[] args) {
SpringApplication.run(LeafServerApplication.class, args);
}
}
package com.sankuai.inf.leaf.server.controller;
import com.sankuai.inf.leaf.common.Result;
import com.sankuai.inf.leaf.common.Status;
import com.sankuai.inf.leaf.server.exception.LeafServerException;
import com.sankuai.inf.leaf.server.exception.NoKeyException;
import com.sankuai.inf.leaf.server.service.SegmentService;
import com.sankuai.inf.leaf.server.service.SnowflakeService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@RestController
public class LeafController {
private Logger logger = LoggerFactory.getLogger(LeafController.class);
@Autowired
private SegmentService segmentService;
@Autowired
private SnowflakeService snowflakeService;
@RequestMapping(value = "/api/segment/get/{key}")
public String getSegmentId(@PathVariable("key") String key) {
return get(key, segmentService.getId(key));
}
@RequestMapping(value = "/api/snowflake/get/{key}")
public String getSnowflakeId(@PathVariable("key") String key) {
return get(key, snowflakeService.getId(key));
}
private String get(String key, Result id) {
Result result;
if (key == null || key.isEmpty()) {
throw new NoKeyException();
}
result = id;
if (result.getStatus().equals(Status.EXCEPTION)) {
throw new LeafServerException(result.toString());
}
return String.valueOf(result.getId());
}
}
package com.sankuai.inf.leaf.server.controller;
import com.sankuai.inf.leaf.segment.SegmentIDGenImpl;
import com.sankuai.inf.leaf.server.model.SegmentBufferView;
import com.sankuai.inf.leaf.segment.model.LeafAlloc;
import com.sankuai.inf.leaf.segment.model.SegmentBuffer;
import com.sankuai.inf.leaf.server.service.SegmentService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@Controller
public class LeafMonitorController {
private Logger logger = LoggerFactory.getLogger(LeafMonitorController.class);
@Autowired
private SegmentService segmentService;
@RequestMapping(value = "cache")
public String getCache(Model model) {
Map<String, SegmentBufferView> data = new HashMap<>();
SegmentIDGenImpl segmentIDGen = segmentService.getIdGen();
if (segmentIDGen == null) {
throw new IllegalArgumentException("You should config leaf.segment.enable=true first");
}
Map<String, SegmentBuffer> cache = segmentIDGen.getCache();
for (Map.Entry<String, SegmentBuffer> entry : cache.entrySet()) {
SegmentBufferView sv = new SegmentBufferView();
SegmentBuffer buffer = entry.getValue();
sv.setInitOk(buffer.isInitOk());
sv.setKey(buffer.getKey());
sv.setPos(buffer.getCurrentPos());
sv.setNextReady(buffer.isNextReady());
sv.setMax0(buffer.getSegments()[0].getMax());
sv.setValue0(buffer.getSegments()[0].getValue().get());
sv.setStep0(buffer.getSegments()[0].getStep());
sv.setMax1(buffer.getSegments()[1].getMax());
sv.setValue1(buffer.getSegments()[1].getValue().get());
sv.setStep1(buffer.getSegments()[1].getStep());
data.put(entry.getKey(), sv);
}
logger.info("Cache info {}", data);
model.addAttribute("data", data);
return "segment";
}
@RequestMapping(value = "db")
public String getDb(Model model) {
SegmentIDGenImpl segmentIDGen = segmentService.getIdGen();
if (segmentIDGen == null) {
throw new IllegalArgumentException("You should config leaf.segment.enable=true first");
}
List<LeafAlloc> items = segmentIDGen.getAllLeafAllocs();
logger.info("DB info {}", items);
model.addAttribute("items", items);
return "db";
}
/**
* the output is like this:
* {
* "timestamp": "1567733700834(2019-09-06 09:35:00.834)",
* "sequenceId": "3448",
* "workerId": "39"
* }
*/
@RequestMapping(value = "decodeSnowflakeId")
@ResponseBody
public Map<String, String> decodeSnowflakeId(@RequestParam("snowflakeId") String snowflakeIdStr) {
Map<String, String> map = new HashMap<>();
try {
long snowflakeId = Long.parseLong(snowflakeIdStr);
long originTimestamp = (snowflakeId >> 22) + 1288834974657L;
Date date = new Date(originTimestamp);
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
map.put("timestamp", String.valueOf(originTimestamp) + "(" + sdf.format(date) + ")");
long workerId = (snowflakeId >> 12) ^ (snowflakeId >> 22 << 10);
map.put("workerId", String.valueOf(workerId));
long sequence = snowflakeId ^ (snowflakeId >> 12 << 12);
map.put("sequenceId", String.valueOf(sequence));
} catch (NumberFormatException e) {
map.put("errorMsg", "Failed to decode snowflake Id!");
}
return map;
}
}
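The bit arithmetic in `decodeSnowflakeId` above can be sketched as a standalone round-trip: compose an id from its three fields (41-bit timestamp offset from the epoch `1288834974657L`, 10-bit workerId, 12-bit sequence), then recover them with the same shifts the controller uses. The class name and sample values below are illustrative, not part of Leaf.

```java
// Illustrative round-trip for the snowflake bit layout assumed by
// LeafMonitorController.decodeSnowflakeId:
//   [41-bit timestamp delta | 10-bit workerId | 12-bit sequence]
public class SnowflakeDecodeDemo {
    // Epoch offset the controller adds back (2010-11-04 UTC).
    static final long TWEPOCH = 1288834974657L;

    static long compose(long timestamp, long workerId, long sequence) {
        return ((timestamp - TWEPOCH) << 22) | (workerId << 12) | sequence;
    }

    static long timestampOf(long id) {
        return (id >> 22) + TWEPOCH;
    }

    static long workerIdOf(long id) {
        // XOR cancels the timestamp bits, leaving the 10-bit worker id.
        return (id >> 12) ^ (id >> 22 << 10);
    }

    static long sequenceOf(long id) {
        // XOR cancels everything above the low 12 sequence bits.
        return id ^ (id >> 12 << 12);
    }

    public static void main(String[] args) {
        long id = compose(1567733700834L, 39, 3448);
        System.out.println(timestampOf(id)); // 1567733700834
        System.out.println(workerIdOf(id));  // 39
        System.out.println(sequenceOf(id));  // 3448
    }
}
```

The values match the sample output documented on `decodeSnowflakeId` above.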
package com.sankuai.inf.leaf.server.exception;
public class InitException extends Exception{
public InitException(String msg) {
super(msg);
}
}
package com.sankuai.inf.leaf.server.exception;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;
@ResponseStatus(code=HttpStatus.INTERNAL_SERVER_ERROR)
public class LeafServerException extends RuntimeException {
public LeafServerException(String msg) {
super(msg);
}
}
package com.sankuai.inf.leaf.server.exception;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;
@ResponseStatus(code=HttpStatus.INTERNAL_SERVER_ERROR,reason="Key is none")
public class NoKeyException extends RuntimeException {
}
package com.sankuai.inf.leaf.server.model;
public class SegmentBufferView {
private String key;
private long value0;
private int step0;
private long max0;
private long value1;
private int step1;
private long max1;
private int pos;
private boolean nextReady;
private boolean initOk;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public long getValue1() {
return value1;
}
public void setValue1(long value1) {
this.value1 = value1;
}
public int getStep1() {
return step1;
}
public void setStep1(int step1) {
this.step1 = step1;
}
public long getMax1() {
return max1;
}
public void setMax1(long max1) {
this.max1 = max1;
}
public long getValue0() {
return value0;
}
public void setValue0(long value0) {
this.value0 = value0;
}
public int getStep0() {
return step0;
}
public void setStep0(int step0) {
this.step0 = step0;
}
public long getMax0() {
return max0;
}
public void setMax0(long max0) {
this.max0 = max0;
}
public int getPos() {
return pos;
}
public void setPos(int pos) {
this.pos = pos;
}
public boolean isNextReady() {
return nextReady;
}
public void setNextReady(boolean nextReady) {
this.nextReady = nextReady;
}
public boolean isInitOk() {
return initOk;
}
public void setInitOk(boolean initOk) {
this.initOk = initOk;
}
}
package com.sankuai.inf.leaf.server.service;
import com.alibaba.druid.pool.DruidDataSource;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.PropertyFactory;
import com.sankuai.inf.leaf.common.Result;
import com.sankuai.inf.leaf.common.ZeroIDGen;
import com.sankuai.inf.leaf.segment.SegmentIDGenImpl;
import com.sankuai.inf.leaf.segment.dao.IDAllocDao;
import com.sankuai.inf.leaf.segment.dao.impl.IDAllocDaoImpl;
import com.sankuai.inf.leaf.server.Constants;
import com.sankuai.inf.leaf.server.exception.InitException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import java.sql.SQLException;
import java.util.Properties;
@Service("SegmentService")
public class SegmentService {
private Logger logger = LoggerFactory.getLogger(SegmentService.class);
private IDGen idGen;
private DruidDataSource dataSource;
public SegmentService() throws SQLException, InitException {
Properties properties = PropertyFactory.getProperties();
boolean flag = Boolean.parseBoolean(properties.getProperty(Constants.LEAF_SEGMENT_ENABLE, "true"));
if (flag) {
// Config dataSource
dataSource = new DruidDataSource();
dataSource.setUrl(properties.getProperty(Constants.LEAF_JDBC_URL));
dataSource.setUsername(properties.getProperty(Constants.LEAF_JDBC_USERNAME));
dataSource.setPassword(properties.getProperty(Constants.LEAF_JDBC_PASSWORD));
dataSource.init();
// Config Dao
IDAllocDao dao = new IDAllocDaoImpl(dataSource);
// Config ID Gen
idGen = new SegmentIDGenImpl();
((SegmentIDGenImpl) idGen).setDao(dao);
if (idGen.init()) {
logger.info("Segment Service Init Successfully");
} else {
throw new InitException("Segment Service Init Fail");
}
} else {
idGen = new ZeroIDGen();
logger.info("Zero ID Gen Service Init Successfully");
}
}
public Result getId(String key) {
return idGen.get(key);
}
public SegmentIDGenImpl getIdGen() {
if (idGen instanceof SegmentIDGenImpl) {
return (SegmentIDGenImpl) idGen;
}
return null;
}
}
package com.sankuai.inf.leaf.server.service;
import com.sankuai.inf.leaf.IDGen;
import com.sankuai.inf.leaf.common.PropertyFactory;
import com.sankuai.inf.leaf.common.Result;
import com.sankuai.inf.leaf.common.ZeroIDGen;
import com.sankuai.inf.leaf.server.Constants;
import com.sankuai.inf.leaf.server.exception.InitException;
import com.sankuai.inf.leaf.snowflake.SnowflakeIDGenImpl;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import java.util.Properties;
@Service("SnowflakeService")
public class SnowflakeService {
private Logger logger = LoggerFactory.getLogger(SnowflakeService.class);
private IDGen idGen;
public SnowflakeService() throws InitException {
Properties properties = PropertyFactory.getProperties();
boolean flag = Boolean.parseBoolean(properties.getProperty(Constants.LEAF_SNOWFLAKE_ENABLE, "true"));
if (flag) {
String zkAddress = properties.getProperty(Constants.LEAF_SNOWFLAKE_ZK_ADDRESS);
int port = Integer.parseInt(properties.getProperty(Constants.LEAF_SNOWFLAKE_PORT));
idGen = new SnowflakeIDGenImpl(zkAddress, port);
if(idGen.init()) {
logger.info("Snowflake Service Init Successfully");
} else {
throw new InitException("Snowflake Service Init Fail");
}
} else {
idGen = new ZeroIDGen();
logger.info("Zero ID Gen Service Init Successfully");
}
}
public Result getId(String key) {
return idGen.get(key);
}
}
server.port=9090
spring.application.name=leaf-server
spring.freemarker.cache=false
spring.freemarker.charset=UTF-8
spring.freemarker.check-template-location=true
spring.freemarker.content-type=text/html
spring.freemarker.expose-request-attributes=true
spring.freemarker.expose-session-attributes=true
spring.freemarker.request-context-attribute=request
leaf.name=com.sankuai.leaf.opensource.test
leaf.segment.enable=false
#leaf.jdbc.url=
#leaf.jdbc.username=
#leaf.jdbc.password=
leaf.snowflake.enable=false
#leaf.snowflake.zk.address=
#leaf.snowflake.port=
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Leaf</title>
<link href="/css/bootstrap.min.css" rel="stylesheet">
</head>
<body>
<table class="table table-hover">
<thead>
<tr>
<th>tag</th>
<th>max</th>
<th>step</th>
<th>update</th>
</tr>
</thead>
<tbody>
<#if items?exists>
<#list items as item>
<tr>
<td>${item.key}</td>
<td>${item.maxId}</td>
<td>${item.step}</td>
<td>${item.updateTime}</td>
</tr>
</#list>
</#if>
</tbody>
</table>
</body>
</html>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Leaf</title>
<link href="/css/bootstrap.min.css" rel="stylesheet">
</head>
<body>
<table class="table table-hover">
<thead>
<tr>
<th>name</th>
<th>init</th>
<th>next</th>
<th>pos</th>
<th>value0</th>
<th>max0</th>
<th>step0</th>
<th>value1</th>
<th>max1</th>
<th>step1</th>
</tr>
</thead>
<tbody>
<#if data?exists>
<#list data?keys as key>
<tr>
<td>${key}</td>
<td>${data[key].initOk?string('true','false')}</td>
<td>${data[key].nextReady?string('true','false')}</td>
<td>${data[key].pos}</td>
<td>${data[key].value0}</td>
<td>${data[key].max0}</td>
<td>${data[key].step0}</td>
<td>${data[key].value1}</td>
<td>${data[key].max1}</td>
<td>${data[key].step1}</td>
</tr>
</#list>
</#if>
</tbody>
</table>
</body>
</html>
package com.sankuai.inf.leaf.server;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
@RunWith(SpringRunner.class)
@SpringBootTest
public class LeafServerApplicationTests {
@Test
public void contextLoads() {
}
}
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-parent</artifactId>
<packaging>pom</packaging>
<version>1.0.1</version>
<name>Leaf</name>
<modules>
<module>leaf-core</module>
<module>leaf-server</module>
</modules>
<description>Distributed ID Generate Service</description>
<developers>
<developer>
<id>zhangzhitong</id>
<name>zhangzhitong</name>
<email>zhitongzhang@outlook.com</email>
</developer>
<developer>
<id>zhanghan</id>
<name>zhanghan</name>
<email>han122655904@163.com</email>
</developer>
<developer>
<id>xiezhaodong</id>
<name>xiezhaodong</name>
<email>pursuer_xie@foxmail.com</email>
</developer>
</developers>
<organization>
<name>Meituan-Dianping Group</name>
<url>https://github.com/Meituan-Dianping</url>
</organization>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<spring.version>5.0.13.RELEASE</spring.version>
<junit.version>4.12</junit.version>
<maven.compiler.version>3.5.1</maven.compiler.version>
<perf4j.version>0.9.16</perf4j.version>
<curator.version>4.0.1</curator.version>
<slf4j.version>1.7.25</slf4j.version>
<druid.version>1.1.12</druid.version>
<jackson-databind.version>2.9.10</jackson-databind.version>
<mysql-connector-java.version>5.1.47</mysql-connector-java.version>
<commons-io.version>2.6</commons-io.version>
<log4j.version>2.7</log4j.version>
<mybatis.version>3.5.3</mybatis.version>
<mybatis-spring.version>2.0.3</mybatis-spring.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.sankuai.inf.leaf</groupId>
<artifactId>leaf-core</artifactId>
<version>1.0.1</version>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>druid</artifactId>
<version>${druid.version}</version>
</dependency>
<!--zk-->
<dependency>
<groupId>org.apache.curator</groupId>
<artifactId>curator-recipes</artifactId>
<version>${curator.version}</version>
<exclusions>
<exclusion>
<artifactId>log4j</artifactId>
<groupId>log4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>${commons-io.version}</version>
</dependency>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis</artifactId>
<version>${mybatis.version}</version>
</dependency>
<dependency>
<groupId>org.perf4j</groupId>
<artifactId>perf4j</artifactId>
<version>${perf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-slf4j-impl</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql-connector-java.version}</version>
</dependency>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis-spring</artifactId>
<version>${mybatis-spring.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<finalName>leaf</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.version}</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.7.5.201505241946</version>
<executions>
<execution>
<id>pre-unit-test</id>
<goals>
<goal>prepare-agent</goal>
</goals>
<configuration>
<destFile>
${project.build.directory}/${project.artifactId}-jacoco.exec
</destFile>
</configuration>
</execution>
<execution>
<id>post-unit-test</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
<configuration>
<dataFile>
${project.build.directory}/${project.artifactId}-jacoco.exec
</dataFile>
<outputDirectory>${project.build.directory}/jacoco</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<distributionManagement>
<repository>
<id>nexus-releases</id>
<name>Nexus Release Repository</name>
<url>http://39.100.254.140:12010/repository/maven-releases/</url>
</repository>
<snapshotRepository>
<id>nexus-snapshots</id>
<name>Nexus Snapshot Repository</name>
<url>http://39.100.254.140:12010/repository/maven-snapshots/</url>
</snapshotRepository>
</distributionManagement>
<repositories>
<repository>
<id>nexus-loit-dev</id>
<name>Nexus Repository</name>
<url>http://39.100.254.140:12010/repository/maven-public/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>nexus-loit-dev</id>
<name>Nexus Plugin Repository</name>
<url>http://39.100.254.140:12010/repository/maven-public/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</pluginRepository>
</pluginRepositories>
</project>
DROP TABLE IF EXISTS `leaf_alloc`;
CREATE TABLE `leaf_alloc` (
`biz_tag` varchar(128) NOT NULL DEFAULT '',
`max_id` bigint(20) NOT NULL DEFAULT '1',
`step` int(11) NOT NULL,
`description` varchar(256) DEFAULT NULL,
`update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`biz_tag`)
) ENGINE=InnoDB;