{ "cells": [ { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "import glob\n", "import ujson" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--2020-04-27 18:20:02-- http://mklab2.iti.gr/multisensor/images/5/5c/WikiRef_dataset.zip\n", "Resolving mklab2.iti.gr (mklab2.iti.gr)... 160.40.50.223\n", "Connecting to mklab2.iti.gr (mklab2.iti.gr)|160.40.50.223|:80... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 1345469 (1,3M) [application/zip]\n", "Saving to: ‘WikiRef_dataset.zip’\n", "\n", "WikiRef_dataset.zip 100%[===================>] 1,28M 2,32MB/s in 0,6s \n", "\n", "2020-04-27 18:20:03 (2,32 MB/s) - ‘WikiRef_dataset.zip’ saved [1345469/1345469]\n", "\n" ] } ], "source": [ "! wget http://mklab2.iti.gr/multisensor/images/5/5c/WikiRef_dataset.zip" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "WikiRef Dataset\r", "\r\n", "\r", "\r\n", "The selected topics of the WikiRef220 dataset (and the number of articles per topic) are: \r", "\r\n", "Paris Attacks November 2015 (36), Barack Obama (5), Premier League (37), Cypriot Financial Crisis 2012-2013 (5), Rolling Stones (1), Debt Crisis in Greece (5), Samsung Galaxy S5 (35), Greek Elections June 2012 (5), smartphone (5), Malaysia Airlines Flight 370 (39), Stephen Hawking (1), Michelle Obama (38), Tohoku earthquake and tsunami (5), NBA draft (1), U2 (1), Wall Street (1). \r", "\r\n", "The topics Barack Obama, Cypriot Financial Crisis 2012-2013, Rolling Stones, Debt Crisis in Greece, Greek Elections June 2012, smartphone, Stephen Hawking, Tohoku earthquake and tsunami, NBA draft, U2 and Wall Street appear no more than 5 times and therefore, they are regarded as noise. 
\r", "\r\n", "\r", "\r\n", "The remaining 5 topics of WikiRef220 are: \r", "\r\n", "•\tParis Attacks November 2015 [1]\r", "\r\n", "•\tPremier League [2]\r", "\r\n", "•\tMalaysia Airlines Flight 370 [3] \r", "\r\n", "•\tSamsung Galaxy S5 [4]\r", "\r\n", "•\tMichelle Obama [5]\r", "\r\n", "\r", "\r\n", "The WikiRef186 dataset (4 topics) is the WikiRef220 without 34 documents related to “Malaysia Airlines Flight 370” and the WikiRef150 dataset (3 topics) is the WikiRef186 without the 36 documents related to “Paris Attacks”. \r", "\r\n", "\r", "\r\n", "References\r", "\r\n", "1. https://en.wikipedia.org/wiki/November_2015_Paris_attacks\r", "\r\n", "2. https://en.wikipedia.org/wiki/Premier_League\r", "\r\n", "3. https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_370\r", "\r\n", "4. https://en.wikipedia.org/wiki/Samsung_Galaxy_S5\r", "\r\n", "5. https://en.wikipedia.org/wiki/Michelle_Obama\r", "\r\n", "\r", "\r\n", "\r", "\r\n", "If you use this dataset, please cite:\r", "\r\n", "\r", "\r\n", "Gialampoukidis, I., Vrochidis, S., & Kompatsiaris, I. (2016). A Hybrid Framework for News Clustering Based on the DBSCAN-Martingale and LDA. In Machine Learning and Data Mining in Pattern Recognition (pp. 170-184). Springer International Publishing.\r", "\r\n", "\r", "\r\n", "\r", "\r\n", "\r", "\r\n" ] } ], "source": [ "! unzip -q WikiRef_dataset.zip \n", "! 
cat WikiRef_dataset/readme.txt\n" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " barack.obama.1.txt\t\t 'ParisAttacks2015 (21).txt'\r\n", " barack.obama.24.txt\t\t 'ParisAttacks2015 (22).txt'\r\n", " barack.obama.3.txt\t\t 'ParisAttacks2015 (23).txt'\r\n", " barack.obama.4.txt\t\t 'ParisAttacks2015 (24).txt'\r\n", " barack.obama.9.txt\t\t 'ParisAttacks2015 (25).txt'\r\n", " cypriot.financial.crisis.36.txt 'ParisAttacks2015 (26).txt'\r\n", " cypriot.financial.crisis.37.txt 'ParisAttacks2015 (27).txt'\r\n", " cypriot.financial.crisis.38.txt 'ParisAttacks2015 (28).txt'\r\n", " cypriot.financial.crisis.39.txt 'ParisAttacks2015 (29).txt'\r\n", " cypriot.financial.crisis.40.txt 'ParisAttacks2015 (2).txt'\r\n", " debt.crisis.greece.178.txt\t 'ParisAttacks2015 (30).txt'\r\n", " debt.crisis.greece.179.txt\t 'ParisAttacks2015 (31).txt'\r\n", " debt.crisis.greece.207.txt\t 'ParisAttacks2015 (32).txt'\r\n", " debt.crisis.greece.263.txt\t 'ParisAttacks2015 (33).txt'\r\n", " debt.crisis.greece.81.txt\t 'ParisAttacks2015 (34).txt'\r\n", " greek.elections.112.txt\t 'ParisAttacks2015 (35).txt'\r\n", " greek.elections.113.txt\t 'ParisAttacks2015 (36).txt'\r\n", " greek.elections.114.txt\t 'ParisAttacks2015 (3).txt'\r\n", " greek.elections.115.txt\t 'ParisAttacks2015 (4).txt'\r\n", " greek.elections.117.txt\t 'ParisAttacks2015 (5).txt'\r\n", " malaysia.airlines.flight.10.txt 'ParisAttacks2015 (6).txt'\r\n", " malaysia.airlines.flight.134.txt 'ParisAttacks2015 (7).txt'\r\n", " malaysia.airlines.flight.137.txt 'ParisAttacks2015 (8).txt'\r\n", " malaysia.airlines.flight.13.txt 'ParisAttacks2015 (9).txt'\r\n", " malaysia.airlines.flight.142.txt premier.league.125.txt\r\n", " malaysia.airlines.flight.143.txt premier.league.126.txt\r\n", " malaysia.airlines.flight.14.txt premier.league.133.txt\r\n", " malaysia.airlines.flight.15.txt premier.league.134.txt\r\n", " malaysia.airlines.flight.16.txt 
premier.league.137.txt\r\n", " malaysia.airlines.flight.18.txt premier.league.140.txt\r\n", " malaysia.airlines.flight.21.txt premier.league.141.txt\r\n", " malaysia.airlines.flight.223.txt premier.league.142.txt\r\n", " malaysia.airlines.flight.225.txt premier.league.143.txt\r\n", " malaysia.airlines.flight.227.txt premier.league.17.txt\r\n", " malaysia.airlines.flight.22.txt premier.league.21.txt\r\n", " malaysia.airlines.flight.230.txt premier.league.22.txt\r\n", " malaysia.airlines.flight.236.txt premier.league.25.txt\r\n", " malaysia.airlines.flight.245.txt premier.league.26.txt\r\n", " malaysia.airlines.flight.24.txt premier.league.29.txt\r\n", " malaysia.airlines.flight.25.txt premier.league.3.txt\r\n", " malaysia.airlines.flight.276.txt premier.league.40.txt\r\n", " malaysia.airlines.flight.278.txt premier.league.44.txt\r\n", " malaysia.airlines.flight.29.txt premier.league.4.txt\r\n", " malaysia.airlines.flight.31.txt premier.league.54.txt\r\n", " malaysia.airlines.flight.32.txt premier.league.55.txt\r\n", " malaysia.airlines.flight.34.txt premier.league.56.txt\r\n", " malaysia.airlines.flight.36.txt premier.league.5.txt\r\n", " malaysia.airlines.flight.66.txt premier.league.61.txt\r\n", " malaysia.airlines.flight.70.txt premier.league.63.txt\r\n", " malaysia.airlines.flight.71.txt premier.league.64.txt\r\n", " malaysia.airlines.flight.74.txt premier.league.68.txt\r\n", " malaysia.airlines.flight.75.txt premier.league.72.txt\r\n", " malaysia.airlines.flight.76.txt premier.league.73.txt\r\n", " malaysia.airlines.flight.79.txt premier.league.77.txt\r\n", " malaysia.airlines.flight.83.txt premier.league.78.txt\r\n", " malaysia.airlines.flight.84.txt premier.league.79.txt\r\n", " malaysia.airlines.flight.86.txt premier.league.87.txt\r\n", " malaysia.airlines.flight.98.txt premier.league.89.txt\r\n", " malaysia.airlines.flight.9.txt premier.league.92.txt\r\n", " michelle.obama.100.txt\t\t premier.league.96.txt\r\n", " michelle.obama.101.txt\t\t 
premier.league.9.txt\r\n", " michelle.obama.102.txt\t\t Rolling.Stones.197.txt\r\n", " michelle.obama.103.txt\t\t samsung.galaxy.s5.13.txt\r\n", " michelle.obama.109.txt\t\t samsung.galaxy.s5.14.txt\r\n", " michelle.obama.111.txt\t\t samsung.galaxy.s5.15.txt\r\n", " michelle.obama.113.txt\t\t samsung.galaxy.s5.16.txt\r\n", " michelle.obama.114.txt\t\t samsung.galaxy.s5.19.txt\r\n", " michelle.obama.115.txt\t\t samsung.galaxy.s5.1.txt\r\n", " michelle.obama.118.txt\t\t samsung.galaxy.s5.21.txt\r\n", " michelle.obama.122.txt\t\t samsung.galaxy.s5.22.txt\r\n", " michelle.obama.123.txt\t\t samsung.galaxy.s5.25.txt\r\n", " michelle.obama.127.txt\t\t samsung.galaxy.s5.26.txt\r\n", " michelle.obama.12.txt\t\t samsung.galaxy.s5.27.txt\r\n", " michelle.obama.130.txt\t\t samsung.galaxy.s5.28.txt\r\n", " michelle.obama.138.txt\t\t samsung.galaxy.s5.29.txt\r\n", " michelle.obama.139.txt\t\t samsung.galaxy.s5.2.txt\r\n", " michelle.obama.142.txt\t\t samsung.galaxy.s5.30.txt\r\n", " michelle.obama.144.txt\t\t samsung.galaxy.s5.31.txt\r\n", " michelle.obama.146.txt\t\t samsung.galaxy.s5.32.txt\r\n", " michelle.obama.15.txt\t\t samsung.galaxy.s5.33.txt\r\n", " michelle.obama.20.txt\t\t samsung.galaxy.s5.34.txt\r\n", " michelle.obama.22.txt\t\t samsung.galaxy.s5.35.txt\r\n", " michelle.obama.23.txt\t\t samsung.galaxy.s5.36.txt\r\n", " michelle.obama.56.txt\t\t samsung.galaxy.s5.37.txt\r\n", " michelle.obama.62.txt\t\t samsung.galaxy.s5.38.txt\r\n", " michelle.obama.64.txt\t\t samsung.galaxy.s5.39.txt\r\n", " michelle.obama.67.txt\t\t samsung.galaxy.s5.3.txt\r\n", " michelle.obama.70.txt\t\t samsung.galaxy.s5.40.txt\r\n", " michelle.obama.73.txt\t\t samsung.galaxy.s5.41.txt\r\n", " michelle.obama.75.txt\t\t samsung.galaxy.s5.42.txt\r\n", " michelle.obama.79.txt\t\t samsung.galaxy.s5.43.txt\r\n", " michelle.obama.80.txt\t\t samsung.galaxy.s5.44.txt\r\n", " michelle.obama.83.txt\t\t samsung.galaxy.s5.45.txt\r\n", " michelle.obama.86.txt\t\t samsung.galaxy.s5.47.txt\r\n", " 
michelle.obama.95.txt\t\t samsung.galaxy.s5.4.txt\r\n", " michelle.obama.98.txt\t\t samsung.galaxy.s5.6.txt\r\n", " michelle.obama.9.txt\t\t samsung.galaxy.s5.7.txt\r\n", " NBA.draft.2.txt\t\t smartphone.107.txt\r\n", "'ParisAttacks2015 (10).txt'\t smartphone.129.txt\r\n", "'ParisAttacks2015 (11).txt'\t smartphone.74.txt\r\n", "'ParisAttacks2015 (12).txt'\t smartphone.78.txt\r\n", "'ParisAttacks2015 (13).txt'\t smartphone.95.txt\r\n", "'ParisAttacks2015 (14).txt'\t stephen.hawking.275.txt\r\n", "'ParisAttacks2015 (15).txt'\t tohoku.earthquake.tsunami.12.txt\r\n", "'ParisAttacks2015 (16).txt'\t tohoku.earthquake.tsunami.14.txt\r\n", "'ParisAttacks2015 (17).txt'\t tohoku.earthquake.tsunami.3.txt\r\n", "'ParisAttacks2015 (18).txt'\t tohoku.earthquake.tsunami.8.txt\r\n", "'ParisAttacks2015 (19).txt'\t tohoku.earthquake.tsunami.9.txt\r\n", "'ParisAttacks2015 (1).txt'\t U2.149.txt\r\n", "'ParisAttacks2015 (20).txt'\t wall.street.51.txt\r\n" ] } ], "source": [ "! ls WikiRef_dataset/WikiRef220" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "import nltk\n", "import string\n", "\n", "import pandas as pd\n", "from glob import glob\n", "\n" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "nltk.data.path.append('/home/evgenyegorov/nltk_data/')" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "dir_name = \"WikiRef_dataset/WikiRef220\"\n", "files = glob(dir_name + '/*.txt')\n" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The secret behind poor smartphone battery life\r", "\r\n", "\r", "\r\n", "Summary:That shiny new smartphone has a battery that easily lasts all day. 
Then it all goes downhill from there\r", "\r\n", "\r", "\r\n", "Rapidly deteriorating battery life has long been tolerated by smartphone owners, mainly because they have no choice. Phones usually meet the holy grail of battery life, the ability to last all day, at first. Then it starts getting worse quickly, until many phone owners end up plugging in during the day. It seems it's always been this way, and there's no incentive to the smartphone makers to make this better.\r", "\r\n", "\r", "\r\n", "Battery has long been the focus of the mobile segment. In the early days of mobile the possible time away from a power outlet was not very long. This prompted OEMs to do what they could to make batteries in mobile devices last longer.\r", "\r\n", "\r", "\r\n", "This is why we now have decent battery life with most laptops and tablets. The eight hours needed to go all day no matter what comes up is now met by many mobile devices. Except the smartphone.\r", "\r\n", "\r", "\r\n", "Pick any smartphone in the top ranks and talk to those who've owned one for a while. Odds are, you'll hear a familiar tale – the phone easily lasted all day when new but that didn't last. If they've owned the phone over a year you'll probably hear about the antics they go through daily to have a phone that is still working at day's end.\r", "\r\n", "\r", "\r\n", "See related: Falling tablet sales: The problem is they're just too good | Necessary battery life for laptops: 8 hours\r", "\r\n", "\r", "\r\n", "So why hasn't smartphone battery life improved like that on tablets and laptops? I believe the reason is obvious – there is no incentive for OEMs to improve this. In fact, there's an incentive to keep things the way they've always been.\r", "\r\n", "\r", "\r\n", "In a recent article I explained that tablet owners aren't upgrading because current devices are good enough . 
There's no compelling reason for tablet owners to get the latest and greatest because the one they have is good enough.\r", "\r\n", "\r", "\r\n", "Good battery life is a big part of that satisfaction with tablets. Most of the top tablets get 10+ hours of battery life, and that's enough for just about everyone. Even over time when those tablet batteries start deteriorating, resulting in shorter battery life, they still last all day given the long life they start with when new.\r", "\r\n", "\r", "\r\n", "With smartphones it's a different story. Many are on the edge of all-day battery life to start, and they fall short as the battery ages. This is common. Take a stroll around the web and you'll find hundreds of articles with tips for \"how to extend the battery on gadget X\". Owners are trying to get through the day with a smartphone that's only a year old, and they'll do anything to make that happen.\r", "\r\n", "\r", "\r\n", "That includes trading in that phone when their US contract is up. I believe it's not just a hankering for the latest shiny smartphone available, it's also to get rid of their old one. They know from experience that their battery life issue will go away with a new phone, due to that new battery inside.\r", "\r\n", "\r", "\r\n", "This is why there is no incentive for smartphone makers to come up with a new battery technology with the same improvements we've seen in laptops and tablets. They need that two-year replacement cycle the phone contracts push, and the dying battery syndrome fuels that upgrade process.\r", "\r\n", "\r", "\r\n", "I'm not suggesting there's some hidden conspiracy behind the smartphone battery situation. I am pointing out that from a business perspective there is no incentive for smartphone makers to improve battery life as they have on other mobile devices. They are happy when you trade that phone in for a new one. They just hope you don't figure out what they are doing. If something's not broken, why fix it? 
\r", "\r\n", "\r", "\r\n" ] } ], "source": [ "! cat WikiRef_dataset/WikiRef220/smartphone.107.txt\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "from nltk.corpus import wordnet\n", "\n", "def nltk2wn_tag(nltk_tag):\n", " if nltk_tag.startswith('J'):\n", " return wordnet.ADJ\n", " elif nltk_tag.startswith('V'):\n", " return wordnet.VERB\n", " elif nltk_tag.startswith('N'):\n", " return wordnet.NOUN\n", " elif nltk_tag.startswith('R'):\n", " return wordnet.ADV\n", " else: \n", " return ''" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "from ftfy import fix_text\n", "\n", "data = []\n", "for path in files:\n", " entry = {}\n", " entry['id'] = path.split('/')[-1].rpartition(\".\")[0]\n", " with open(path, 'r', encoding=\"Latin\") as f:\n", " entry['raw_text'] = \" \".join(fix_text(line).strip() for line in f)\n", " data.append(entry)\n", "\n", "texts = pd.DataFrame(data)" ] }, { "cell_type": "raw", "metadata": {}, "source": [ "from tqdm import tqdm\n", "\n", "tokenized_text = []\n", "\n", "for text in tqdm(texts['raw_text'].values):\n", " tokens = nltk.wordpunct_tokenize(text.lower())\n", " tokenized_text.append(nltk.pos_tag(tokens))\n", "texts['tokenized'] = tokenized_text" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [], "source": [ "from nltk.stem import WordNetLemmatizer\n", "from nltk.corpus import stopwords\n", "stop = set(stopwords.words('english'))\n", "\n", "wnl = WordNetLemmatizer()\n" ] }, { "cell_type": "raw", "metadata": {}, "source": [ "from nltk.stem import WordNetLemmatizer\n", "from nltk.corpus import stopwords\n", "stop = set(stopwords.words('english'))\n", "\n", "lemmatized_text = []\n", "wnl = WordNetLemmatizer()\n", "for text in texts['tokenized'].values:\n", " lemmatized = [wnl.lemmatize(word,nltk2wn_tag(pos))\n", " if 
nltk2wn_tag(pos) != ''\n", " else wnl.lemmatize(word)\n", " for word, pos in text ]\n", " lemmatized = [word for word in lemmatized \n", " if word not in stop and word.isalpha()]\n", " lemmatized_text.append(lemmatized)\n", "texts['lemmatized'] = lemmatized_text" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "import sys\n", "sys.path.append('../clustering_system/')\n", "\n", "import re\n", "import os\n", "import pymorphy2\n", "from collections import Counter\n", "\n", "from preprocessing.aitexta.tokenization import RegexpTokenizer\n", "from preprocessing.aitexta.tokenization import SpacyRulesRussianTokenizer\n", "from preprocessing.aitexta.utils import FunctionalTransformer\n", "from preprocessing.aitexta.lemmatization import Pymorphy2Lemmatizer\n", "from preprocessing.aitexta.feature_filtering import StopTokensRemover\n", "from preprocessing.aitexta.feature_filtering import RegexpFilter\n", "from preprocessing.aitexta.feature_filtering import BASE_RU_NONTOKEN_REGEXPS\n", "from preprocessing.aitexta.feature_extracting import TopMineNgrammer\n", "from preprocessing.aitexta.ner_extracting import Pymorphy2NerExtractor\n", "from preprocessing.aitexta.utils import MergerTransformer\n", "from preprocessing.aitexta.pipeline import SequentialPipeline" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": 
"code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "this_file_path = os.path.dirname('../clustering_system/preprocessing/')\n", "STOP_TOKENS_RELATIVE_PATH = 'aitexta/feature_filtering/stop_words_files/BASE_UNIVERSAL.txt'\n", "STOP_TOKENS_FILE = os.path.join(this_file_path, STOP_TOKENS_RELATIVE_PATH)" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "# TODO\n", "# specSymb = {\"«\", \"»\", \"—\", \"“\", \"-\", \"№\"}\n", "# specSymb = punctuation + \"«»—“-№\"\n", "# pattern = re.compile(\"[\" + re.escape(specSymb) + \"]\")\n" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [], "source": [ "\n", "def tokenize(text):\n", " tokens = nltk.wordpunct_tokenize(text.lower())\n", " return nltk.pos_tag(tokens)\n", "\n", "def lemmatize(text):\n", " lemmatized = [wnl.lemmatize(word,nltk2wn_tag(pos))\n", " if nltk2wn_tag(pos) != ''\n", " else wnl.lemmatize(word)\n", " for word, pos in text ]\n", " lemmatized = [word for word in lemmatized \n", " if word not in stop and word.isalpha()]\n", " return lemmatized\n", "\n", "\n", "class NGrammPreprocessor:\n", " def __init__(self, n_jobs=1, verbose=False):\n", " nltk_tokenizer = FunctionalTransformer(tokenize)\n", " nltk_lemmatizer = FunctionalTransformer(lemmatize)\n", " \n", " self.text_to_lemmatized = SequentialPipeline(steps=[\n", " ('tokenization', nltk_tokenizer),\n", " ('lemmatization', nltk_lemmatizer),\n", " ], verbose=verbose)\n", "\n", " self.lemmatized_to_ngramms = SequentialPipeline(steps=[\n", " ('ngrammer', TopMineNgrammer(stop_tokens_file=STOP_TOKENS_FILE,\n", " nontoken_regexps=None,\n", " allow_delimiters_in_ngramm=True,\n", " output_type='string_ngramm',\n", " n_jobs=n_jobs))\n", " ], verbose=verbose)\n", "\n", " def fit_transform(self, X):\n", " X = self._clean_dataframe(X)\n", "\n", " X['lemmatized'] = self.text_to_lemmatized.fit_transform(X['raw_text'].values)\n", "\n", " X['ngramms'] = 
self.lemmatized_to_ngramms.fit_transform(X['lemmatized'].values)\n", "\n", " for column in ['lemmatized', 'ngramms']:\n", " X[column] = X[column].apply(lambda x: \" \".join(x))\n", "\n", " return X\n", "\n", " def transform(self, X):\n", " X = self._clean_dataframe(X)\n", "\n", " X['lemmatized'] = self.text_to_lemmatized.transform(X['raw_text'].values)\n", "\n", " X['ngramms'] = self.lemmatized_to_ngramms.transform(X['lemmatized'].values)\n", "\n", " for column in ['lemmatized', 'ngramms']:\n", " X[column] = X[column].apply(lambda x: \" \".join(x))\n", "\n", " return X\n", "\n", " def _clean_dataframe(self, X):\n", " X['raw_text'] = X['raw_text'].astype(str).str.strip()\n", " X = X[X['raw_text'] != '']\n", "\n", " return X" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Start of pipeline\n", "**************************************************************************\n", "\n", "Start of tokenization fit transform\n", "End of tokenization fit transform\n", "Total time: 6.8257880210876465\n", "----------------------------------------------------------------------\n", "Start of lemmatization fit transform\n", "End of lemmatization fit transform\n", "Total time: 2.13810396194458\n", "----------------------------------------------------------------------\n", "End of pipeline\n", "Total time: 8.972941160202026\n", "----------------------------------------------------------------------\n", "**************************************************************************\n", "\n", "Start of pipeline\n", "**************************************************************************\n", "\n", "Start of ngrammer fit transform\n", "End of ngrammer fit transform\n", "Total time: 3.967646598815918\n", "----------------------------------------------------------------------\n", "End of pipeline\n", "Total time: 3.968014717102051\n", 
"----------------------------------------------------------------------\n", "**************************************************************************\n", "\n" ] } ], "source": [ "preprocesser = NGrammPreprocessor(n_jobs=10, verbose=True)\n", "wiki_processed = preprocesser.fit_transform(texts)" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "                      id                                           raw_text                                         lemmatized                                            ngramms\n", "0         smartphone.107  The secret behind poor smartphone battery life...  secret behind poor smartphone battery life sum...  battery_life battery_life battery_life battery...\n", "1  ParisAttacks2015 (28)  On the run from Isis: Jihadists 'targeting Par...  run isi jihadist target paris attacker salah a...  run_isi run_isi paris_attacker salah_abdeslam ...\n", "2     michelle.obama.103  Michelle Obama Goes Organic and Brings in the ...  michelle obama go organic bring bee go organic...  michelle_obama michelle_obama white_house whit...\n", "3      premier.league.61  Premiership stays on BBC  The BBC has won the ...  premiership stay bbc bbc win right show premie...  premier_league premier_league premier_league p...\n", "4  ParisAttacks2015 (10)  Hollande says Paris attacks 'an act of war' by...  hollande say paris attack act war islamic stat...  paris_attack act_war act_war act_war islamic_s..." ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "wiki_processed.head()" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "                               id                                           raw_text                                            vw_text\n", "0                  smartphone.107  The secret behind poor smartphone battery life...  smartphone.107 |@lemmatized secret behind poor...\n", "1           ParisAttacks2015 (28)  On the run from Isis: Jihadists 'targeting Par...  ParisAttacks2015_(28) |@lemmatized run isi jih...\n", "2              michelle.obama.103  Michelle Obama Goes Organic and Brings in the ...  michelle.obama.103 |@lemmatized michelle obama...\n", "3               premier.league.61  Premiership stays on BBC  The BBC has won the ...  premier.league.61 |@lemmatized premiership sta...\n", "4           ParisAttacks2015 (10)  Hollande says Paris attacks 'an act of war' by...  ParisAttacks2015_(10) |@lemmatized hollande sa...\n", "..                            ...                                                ...                                                ...\n", "215            michelle.obama.114  Michelle Obama Welcomes Gay Families to Nation...  michelle.obama.114 |@lemmatized michelle obama...\n", "216   malaysia.airlines.flight.32  Malaysia jet passengers likely suffocated, Aus...  malaysia.airlines.flight.32 |@lemmatized malay...\n", "217          samsung.galaxy.s5.22  Samsung Unveils New Products from its System L...  samsung.galaxy.s5.22 |@lemmatized samsung unve...\n", "218  malaysia.airlines.flight.143  Investigators find no unusual signs among MH37...  malaysia.airlines.flight.143 |@lemmatized inve...\n", "219  malaysia.airlines.flight.225  Malaysia Airlines Flight 370: How much will fa...  malaysia.airlines.flight.225 |@lemmatized mala...\n", "\n", "[220 rows x 3 columns]" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [] } ], "metadata": { "language_info": { "name": "python" } }, "nbformat": 4, "nbformat_minor": 2 }