Using Python to Clear RabbitMQ Connections with No Data for 1 Hour
After our RabbitMQ consumers reconnect, the old connections are not closed and stay open on the server side; we have not yet found the cause. As a result the connection count keeps climbing over time, so I wrote a Python script to delete the stale connections. Whether a connection is still valid is judged by its data transfer: if it has transferred no data for a full hour, it is treated as stale and removed. The implementation is as follows:
#!/usr/bin/python
# coding: utf8
"""
Delete all RabbitMQ connections that have transferred no data in the last hour.
"""
from optparse import OptionParser
import sys
import json
import urllib2
import requests

# Allow non-ASCII characters in the option help text (Python 2 only).
reload(sys)
sys.setdefaultencoding("utf-8")

usage = sys.argv[0] + " <Options>"
parser = OptionParser(usage)
parser.add_option("-v", dest="vhost", action="store", default=False,
                  help="Select vhost, default: all vhosts")
options, args = parser.parse_args()


class RabbitMQ:
    def __init__(self, user='guest', passwd='guest', server_ip='192.168.1.31'):
        self.user = user
        self.password = passwd
        self.server_ip = server_ip

    def getAllConnections(self):
        """Fetch all connections from the management API and return the parsed JSON."""
        if not options.vhost:
            connections = requests.get("http://{0}:15672/api/connections".format(self.server_ip),
                                       auth=(self.user, self.password))
        else:
            connections = requests.get("http://{0}:15672/api/vhosts/{1}/connections".format(self.server_ip, options.vhost),
                                       auth=(self.user, self.password))
        return connections.json()

    def connectionsNumber(self):
        """Extract the connection names from the JSON above and return them as a list."""
        names = []
        data = self.getAllConnections()
        for i in data:
            names.append(i['name'])
        return names

    def url(self):
        """Build the per-connection API URLs (rates averaged over 1 hour) and return them as a list."""
        url_list = []
        data = self.connectionsNumber()
        for i in data:
            send_ip = i.split("->")[0].split(":")[0]
            send_port = i.split("->")[0].split(":")[1].strip()
            Receive_ip = i.split("->")[1].split(":")[0].strip()
            url = 'http://{0}:15672/api/connections/{1}%3A{2}%20-%3E%20{3}%3A5672?data_rates_age=3600&data_rates_incr=60'.format(
                self.server_ip, send_ip, send_port, Receive_ip)
            url_list.append(url)
        return url_list

    def getAllData(self):
        """Fetch the stats of every connection from the API and return them as a list."""
        data = self.url()
        if self.server_ip == "10.8.5.3":
            authorization = "Basic Z3Vlc3Q6UmRuN3lsV2FmZWs2Sjk4aA=="
        else:
            authorization = "Basic Z3Vlc3Q6Z3Vlc3Q="
        user_agent = ("Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 "
                      "(KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36")
        stats = []
        for i in data:
            request = urllib2.Request(i, headers={"User-Agent": user_agent, "Authorization": authorization})
            req = urllib2.urlopen(request)
            stats.append(json.loads(req.read()))
        return stats

    def getAvgRate(self):
        """Check each connection's recv_oct_details and send_oct_details averaged over
        1 hour; if no data was transferred, the connection is stale, so build the
        delete API URL and delete it."""
        data = self.getAllData()
        for i in data:
            recv_avg_rate = i.get("recv_oct_details").get("avg_rate")
            send_avg_rate = i.get("send_oct_details").get("avg_rate")
            # Nothing received and nothing sent over the whole hour -> stale connection.
            if recv_avg_rate == 0 and send_avg_rate == 0:
                name = i.get("name")
                send_ip = name.split("->")[0].split(":")[0]
                send_port = name.split("->")[0].split(":")[1].strip()
                Receive_ip = name.split("->")[1].split(":")[0].strip()
                api = 'http://{0}:15672/api/connections/{1}:{2}%20-%3E%20{3}:5672'.format(
                    self.server_ip, send_ip, send_port, Receive_ip)
                code = requests.delete(api, auth=(self.user, self.password))
                if code.status_code == 204:
                    print("{0} deleted successfully".format(send_port))
                else:
                    print("{0} delete failed".format(send_port))
                    sys.exit(1)
        print("done...")


if __name__ == '__main__':
    mq = RabbitMQ()
    mq.getAvgRate()
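As a side note, the hard-coded Basic Authorization header in getAllData only matches the two passwords baked into the script. Below is a minimal sketch of the same per-connection query done purely with requests, reusing the credentials passed to __init__ via its auth parameter; the helper name connection_stats and the example connection name are my own illustration, not part of the original script:

import requests

try:                        # Python 3
    from urllib.parse import quote
except ImportError:         # Python 2, matching the script above
    from urllib import quote

def connection_stats(server_ip, conn_name, user="guest", passwd="guest"):
    """Fetch one connection's stats, with rates averaged over the last hour."""
    # conn_name is a name as returned by GET /api/connections,
    # e.g. "192.168.1.50:53422 -> 192.168.1.31:5672"; it must be
    # percent-encoded before being placed in the URL path.
    url = "http://{0}:15672/api/connections/{1}".format(server_ip, quote(conn_name, safe=""))
    params = {"data_rates_age": 3600, "data_rates_incr": 60}
    resp = requests.get(url, params=params, auth=(user, passwd))
    resp.raise_for_status()
    return resp.json()

Deleting a stale connection works the same way: a requests.delete() against the same connection URL (without the rate parameters) returns 204 on success, which is exactly what the script above checks for.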
This article is from the "蓝色_风暴" blog; please contact the author before reposting.