How to solve the write bottleneck of Exceptionless distributed logging in C#

As we know, in a distributed logging setup the Exceptionless client ultimately lands log events in an Elasticsearch database, much as we write data into a relational database. And since this is a write path, a large volume of events arriving in a short window inevitably raises the question of write efficiency. Let's start by looking at the Exceptionless client source.

public class InMemoryObjectStorage : IObjectStorage {
        private readonly Dictionary<string, Tuple<ObjectInfo, object>> _storage = new Dictionary<string, Tuple<ObjectInfo, object>>(StringComparer.OrdinalIgnoreCase);
        private readonly object _lock = new object();

        // By default the in-memory store is capped at 1000 objects.
        public InMemoryObjectStorage() : this(1000) {}

        public InMemoryObjectStorage(int maxObjects) {
            MaxObjects = maxObjects;
        }

        public long MaxObjects { get; set; }

        public int Count {
            get { return _storage.Count; }
        }

        // ... remaining members elided ...

When we write a log event, the client first buffers it in this in-memory store. To keep the logger from exhausting server memory, the store is capped at 1,000 entries by default.

Once a new event pushes the count past that cap, an entry is evicted from the dictionary, and the event it held is lost for good.

public bool SaveObject<T>(string path, T value) where T : class {
            if (String.IsNullOrWhiteSpace(path))
                throw new ArgumentNullException("path");

            lock (_lock) {
                _storage[path] = Tuple.Create(new ObjectInfo {
                    Created = DateTime.Now,
                    Modified = DateTime.Now,
                    Path = path
                }, (object)value);

                // Once the count exceeds MaxObjects, an entry is evicted outright;
                // whatever event it held is silently dropped.
                if (_storage.Count > MaxObjects)
                    _storage.Remove(_storage.OrderByDescending(kvp => kvp.Value.Item1.Created).First().Key);
            }

            return true;
        }
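
To make the data loss concrete, here is a minimal standalone sketch of the same eviction pattern (demo code, not the library's): a dictionary capped at 1,000 entries, fed 10,000 events.

using System;
using System.Collections.Generic;
using System.Linq;

// Demo: reproduce the capped-store eviction pattern from SaveObject above.
class CappedStoreDemo {
    static void Main() {
        const int maxObjects = 1000;
        var storage = new Dictionary<string, DateTime>();

        for (int i = 0; i < 10_000; i++) {
            storage[$"event-{i}"] = DateTime.Now;

            // Same shape as the library's eviction: over the cap, drop an entry.
            if (storage.Count > maxObjects)
                storage.Remove(storage.OrderByDescending(kvp => kvp.Value).First().Key);
        }

        // Only 1,000 of the 10,000 events survive; the other 9,000 are gone.
        Console.WriteLine(storage.Count); // prints 1000
    }
}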

This is clearly not what we want; we need the log to be complete, with nothing dropped.

public SubmissionResponse PostEvents(IEnumerable<Event> events, ExceptionlessConfiguration config, IJsonSerializer serializer) {
            if (!config.IsValid)
                return new SubmissionResponse(500, message: "Invalid client configuration settings");

            string data = serializer.Serialize(events);
            string url = String.Format("{0}/events", GetServiceEndPoint(config));

            HttpResponseMessage response;
            try {
                HttpContent content = new StringContent(data, Encoding.UTF8, "application/json");

                // don't compress data smaller than 4kb
                if (data.Length > 1024 * 4)
                    content = new GzipContent(content);

                _client.Value.AddAuthorizationHeader(config.ApiKey);
                // Synchronous (blocking) HTTP POST per batch: this round-trip
                // to the server is where the write path bottlenecks.
                response = _client.Value.PostAsync(url, content).ConfigureAwait(false).GetAwaiter().GetResult();
            } catch (Exception ex) {
                return new SubmissionResponse(500, exception: ex);
            }

            int settingsVersion;
            if (Int32.TryParse(GetSettingsVersionHeader(response.Headers), out settingsVersion))
                SettingsManager.CheckVersion(settingsVersion, config);

            return new SubmissionResponse((int)response.StatusCode, GetResponseMessage(response));
        }

So PostEvents is the method we need to work on.

Our current approach: instead of POSTing each batch to the Exceptionless endpoint, we write the log events straight to a message queue, and run a separate subscriber service that consumes the queue and writes the events into Elasticsearch. Both halves are sketched below.
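
First, the producer side. Below is a minimal sketch of a submission client whose PostEvents keeps the signature quoted above but publishes the serialized batch to a RabbitMQ queue instead of making the blocking HTTP round-trip. It assumes the Exceptionless client package and RabbitMQ.Client are referenced; the class name, queue name, and connection settings are illustrative, and the other ISubmissionClient members are omitted here.

using System.Collections.Generic;
using System.Text;
using Exceptionless;
using Exceptionless.Models;
using Exceptionless.Serializer;
using Exceptionless.Submission;
using RabbitMQ.Client;

// Sketch: hand event batches to a message queue instead of POSTing them over HTTP.
public class QueueSubmissionClient {
    private readonly IModel _channel;

    public QueueSubmissionClient() {
        var factory = new ConnectionFactory { HostName = "localhost" };
        _channel = factory.CreateConnection().CreateModel();
        // Durable queue, so buffered batches survive a broker restart.
        _channel.QueueDeclare(queue: "exceptionless-events", durable: true,
                              exclusive: false, autoDelete: false);
    }

    public SubmissionResponse PostEvents(IEnumerable<Event> events, ExceptionlessConfiguration config, IJsonSerializer serializer) {
        // Same serialization step as the original client...
        string data = serializer.Serialize(events);

        // ...but the batch goes onto the queue; the HTTP round-trip and its
        // latency disappear from the logging hot path.
        _channel.BasicPublish(exchange: "", routingKey: "exceptionless-events",
                              basicProperties: null, body: Encoding.UTF8.GetBytes(data));

        return new SubmissionResponse(202, message: "Queued");
    }
}

To wire this in, the remaining ISubmissionClient members (user descriptions, settings) would be implemented as well, and the class registered through the client configuration's dependency resolver (in recent client versions, something like config.Resolver.Register<ISubmissionClient, QueueSubmissionClient>()).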
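
Then the consumer side: a small worker service that drains the queue and bulk-writes each batch into Elasticsearch. This sketch assumes RabbitMQ.Client and the low-level Elasticsearch.Net client; the queue name, index name, and endpoints are illustrative, and retry handling is kept to the bare minimum.

using System;
using System.Text;
using Elasticsearch.Net;
using Newtonsoft.Json.Linq;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

// Sketch: subscribe to the queue and bulk-index each batch into Elasticsearch.
class LogConsumerService {
    static void Main() {
        var es = new ElasticLowLevelClient(
            new ConnectionConfiguration(new Uri("http://localhost:9200")));

        var factory = new ConnectionFactory { HostName = "localhost" };
        using var connection = factory.CreateConnection();
        using var channel = connection.CreateModel();
        channel.QueueDeclare(queue: "exceptionless-events", durable: true,
                             exclusive: false, autoDelete: false);

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (_, ea) => {
            // Each message is one serialized batch of events (a JSON array).
            string json = Encoding.UTF8.GetString(ea.Body.ToArray());

            // Turn the array into the NDJSON body the _bulk API expects.
            // (Elasticsearch versions before 7.x also require a _type field.)
            var body = new StringBuilder();
            foreach (var evt in JArray.Parse(json)) {
                body.AppendLine("{\"index\":{\"_index\":\"logs\"}}");
                body.AppendLine(evt.ToString(Newtonsoft.Json.Formatting.None));
            }

            var response = es.Bulk<StringResponse>(PostData.String(body.ToString()));

            // Ack only after Elasticsearch accepted the batch, so a failed
            // write leaves the message on the queue to be retried.
            if (response.Success)
                channel.BasicAck(ea.DeliveryTag, multiple: false);
            else
                channel.BasicNack(ea.DeliveryTag, multiple: false, requeue: true);
        };

        channel.BasicConsume(queue: "exceptionless-events", autoAck: false, consumer: consumer);
        Console.ReadLine(); // keep the worker alive
    }
}

With this split, the client's hot path shrinks to a queue publish, nothing is evicted from a 1,000-entry in-memory dictionary, and Elasticsearch is written at whatever pace it can sustain; if the queue starts backing up, the consumer can batch more aggressively or be scaled out horizontally.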