--- /dev/null
+ GNU LESSER GENERAL PUBLIC LICENSE
+ Version 2.1, February 1999
+
+ Copyright (C) 1991, 1999 Free Software Foundation, Inc.
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+[This is the first released version of the Lesser GPL. It also counts
+ as the successor of the GNU Library Public License, version 2, hence
+ the version number 2.1.]
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+Licenses are intended to guarantee your freedom to share and change
+free software--to make sure the software is free for all its users.
+
+ This license, the Lesser General Public License, applies to some
+specially designated software packages--typically libraries--of the
+Free Software Foundation and other authors who decide to use it. You
+can use it too, but we suggest you first think carefully about whether
+this license or the ordinary General Public License is the better
+strategy to use in any particular case, based on the explanations below.
+
+ When we speak of free software, we are referring to freedom of use,
+not price. Our General Public Licenses are designed to make sure that
+you have the freedom to distribute copies of free software (and charge
+for this service if you wish); that you receive source code or can get
+it if you want it; that you can change the software and use pieces of
+it in new free programs; and that you are informed that you can do
+these things.
+
+ To protect your rights, we need to make restrictions that forbid
+distributors to deny you these rights or to ask you to surrender these
+rights. These restrictions translate to certain responsibilities for
+you if you distribute copies of the library or if you modify it.
+
+ For example, if you distribute copies of the library, whether gratis
+or for a fee, you must give the recipients all the rights that we gave
+you. You must make sure that they, too, receive or can get the source
+code. If you link other code with the library, you must provide
+complete object files to the recipients, so that they can relink them
+with the library after making changes to the library and recompiling
+it. And you must show them these terms so they know their rights.
+
+ We protect your rights with a two-step method: (1) we copyright the
+library, and (2) we offer you this license, which gives you legal
+permission to copy, distribute and/or modify the library.
+
+ To protect each distributor, we want to make it very clear that
+there is no warranty for the free library. Also, if the library is
+modified by someone else and passed on, the recipients should know
+that what they have is not the original version, so that the original
+author's reputation will not be affected by problems that might be
+introduced by others.
+\f
+ Finally, software patents pose a constant threat to the existence of
+any free program. We wish to make sure that a company cannot
+effectively restrict the users of a free program by obtaining a
+restrictive license from a patent holder. Therefore, we insist that
+any patent license obtained for a version of the library must be
+consistent with the full freedom of use specified in this license.
+
+ Most GNU software, including some libraries, is covered by the
+ordinary GNU General Public License. This license, the GNU Lesser
+General Public License, applies to certain designated libraries, and
+is quite different from the ordinary General Public License. We use
+this license for certain libraries in order to permit linking those
+libraries into non-free programs.
+
+ When a program is linked with a library, whether statically or using
+a shared library, the combination of the two is legally speaking a
+combined work, a derivative of the original library. The ordinary
+General Public License therefore permits such linking only if the
+entire combination fits its criteria of freedom. The Lesser General
+Public License permits more lax criteria for linking other code with
+the library.
+
+ We call this license the "Lesser" General Public License because it
+does Less to protect the user's freedom than the ordinary General
+Public License. It also provides other free software developers Less
+of an advantage over competing non-free programs. These disadvantages
+are the reason we use the ordinary General Public License for many
+libraries. However, the Lesser license provides advantages in certain
+special circumstances.
+
+ For example, on rare occasions, there may be a special need to
+encourage the widest possible use of a certain library, so that it becomes
+a de-facto standard. To achieve this, non-free programs must be
+allowed to use the library. A more frequent case is that a free
+library does the same job as widely used non-free libraries. In this
+case, there is little to gain by limiting the free library to free
+software only, so we use the Lesser General Public License.
+
+ In other cases, permission to use a particular library in non-free
+programs enables a greater number of people to use a large body of
+free software. For example, permission to use the GNU C Library in
+non-free programs enables many more people to use the whole GNU
+operating system, as well as its variant, the GNU/Linux operating
+system.
+
+ Although the Lesser General Public License is Less protective of the
+users' freedom, it does ensure that the user of a program that is
+linked with the Library has the freedom and the wherewithal to run
+that program using a modified version of the Library.
+
+ The precise terms and conditions for copying, distribution and
+modification follow. Pay close attention to the difference between a
+"work based on the library" and a "work that uses the library". The
+former contains code derived from the library, whereas the latter must
+be combined with the library in order to run.
+\f
+ GNU LESSER GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License Agreement applies to any software library or other
+program which contains a notice placed by the copyright holder or
+other authorized party saying it may be distributed under the terms of
+this Lesser General Public License (also called "this License").
+Each licensee is addressed as "you".
+
+ A "library" means a collection of software functions and/or data
+prepared so as to be conveniently linked with application programs
+(which use some of those functions and data) to form executables.
+
+ The "Library", below, refers to any such software library or work
+which has been distributed under these terms. A "work based on the
+Library" means either the Library or any derivative work under
+copyright law: that is to say, a work containing the Library or a
+portion of it, either verbatim or with modifications and/or translated
+straightforwardly into another language. (Hereinafter, translation is
+included without limitation in the term "modification".)
+
+ "Source code" for a work means the preferred form of the work for
+making modifications to it. For a library, complete source code means
+all the source code for all modules it contains, plus any associated
+interface definition files, plus the scripts used to control compilation
+and installation of the library.
+
+ Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running a program using the Library is not restricted, and output from
+such a program is covered only if its contents constitute a work based
+on the Library (independent of the use of the Library in a tool for
+writing it). Whether that is true depends on what the Library does
+and what the program that uses the Library does.
+
+ 1. You may copy and distribute verbatim copies of the Library's
+complete source code as you receive it, in any medium, provided that
+you conspicuously and appropriately publish on each copy an
+appropriate copyright notice and disclaimer of warranty; keep intact
+all the notices that refer to this License and to the absence of any
+warranty; and distribute a copy of this License along with the
+Library.
+
+ You may charge a fee for the physical act of transferring a copy,
+and you may at your option offer warranty protection in exchange for a
+fee.
+\f
+ 2. You may modify your copy or copies of the Library or any portion
+of it, thus forming a work based on the Library, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) The modified work must itself be a software library.
+
+ b) You must cause the files modified to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ c) You must cause the whole of the work to be licensed at no
+ charge to all third parties under the terms of this License.
+
+ d) If a facility in the modified Library refers to a function or a
+ table of data to be supplied by an application program that uses
+ the facility, other than as an argument passed when the facility
+ is invoked, then you must make a good faith effort to ensure that,
+ in the event an application does not supply such function or
+ table, the facility still operates, and performs whatever part of
+ its purpose remains meaningful.
+
+ (For example, a function in a library to compute square roots has
+ a purpose that is entirely well-defined independent of the
+ application. Therefore, Subsection 2d requires that any
+ application-supplied function or table used by this function must
+ be optional: if the application does not supply it, the square
+ root function must still compute square roots.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Library,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Library, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote
+it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Library.
+
+In addition, mere aggregation of another work not based on the Library
+with the Library (or with a work based on the Library) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may opt to apply the terms of the ordinary GNU General Public
+License instead of this License to a given copy of the Library. To do
+this, you must alter all the notices that refer to this License, so
+that they refer to the ordinary GNU General Public License, version 2,
+instead of to this License. (If a newer version than version 2 of the
+ordinary GNU General Public License has appeared, then you can specify
+that version instead if you wish.) Do not make any other change in
+these notices.
+\f
+ Once this change is made in a given copy, it is irreversible for
+that copy, so the ordinary GNU General Public License applies to all
+subsequent copies and derivative works made from that copy.
+
+ This option is useful when you wish to copy part of the code of
+the Library into a program that is not a library.
+
+ 4. You may copy and distribute the Library (or a portion or
+derivative of it, under Section 2) in object code or executable form
+under the terms of Sections 1 and 2 above provided that you accompany
+it with the complete corresponding machine-readable source code, which
+must be distributed under the terms of Sections 1 and 2 above on a
+medium customarily used for software interchange.
+
+ If distribution of object code is made by offering access to copy
+from a designated place, then offering equivalent access to copy the
+source code from the same place satisfies the requirement to
+distribute the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 5. A program that contains no derivative of any portion of the
+Library, but is designed to work with the Library by being compiled or
+linked with it, is called a "work that uses the Library". Such a
+work, in isolation, is not a derivative work of the Library, and
+therefore falls outside the scope of this License.
+
+ However, linking a "work that uses the Library" with the Library
+creates an executable that is a derivative of the Library (because it
+contains portions of the Library), rather than a "work that uses the
+library". The executable is therefore covered by this License.
+Section 6 states terms for distribution of such executables.
+
+ When a "work that uses the Library" uses material from a header file
+that is part of the Library, the object code for the work may be a
+derivative work of the Library even though the source code is not.
+Whether this is true is especially significant if the work can be
+linked without the Library, or if the work is itself a library. The
+threshold for this to be true is not precisely defined by law.
+
+ If such an object file uses only numerical parameters, data
+structure layouts and accessors, and small macros and small inline
+functions (ten lines or less in length), then the use of the object
+file is unrestricted, regardless of whether it is legally a derivative
+work. (Executables containing this object code plus portions of the
+Library will still fall under Section 6.)
+
+ Otherwise, if the work is a derivative of the Library, you may
+distribute the object code for the work under the terms of Section 6.
+Any executables containing that work also fall under Section 6,
+whether or not they are linked directly with the Library itself.
+\f
+ 6. As an exception to the Sections above, you may also combine or
+link a "work that uses the Library" with the Library to produce a
+work containing portions of the Library, and distribute that work
+under terms of your choice, provided that the terms permit
+modification of the work for the customer's own use and reverse
+engineering for debugging such modifications.
+
+ You must give prominent notice with each copy of the work that the
+Library is used in it and that the Library and its use are covered by
+this License. You must supply a copy of this License. If the work
+during execution displays copyright notices, you must include the
+copyright notice for the Library among them, as well as a reference
+directing the user to the copy of this License. Also, you must do one
+of these things:
+
+ a) Accompany the work with the complete corresponding
+ machine-readable source code for the Library including whatever
+ changes were used in the work (which must be distributed under
+ Sections 1 and 2 above); and, if the work is an executable linked
+ with the Library, with the complete machine-readable "work that
+ uses the Library", as object code and/or source code, so that the
+ user can modify the Library and then relink to produce a modified
+ executable containing the modified Library. (It is understood
+ that the user who changes the contents of definitions files in the
+ Library will not necessarily be able to recompile the application
+ to use the modified definitions.)
+
+ b) Use a suitable shared library mechanism for linking with the
+ Library. A suitable mechanism is one that (1) uses at run time a
+ copy of the library already present on the user's computer system,
+ rather than copying library functions into the executable, and (2)
+ will operate properly with a modified version of the library, if
+ the user installs one, as long as the modified version is
+ interface-compatible with the version that the work was made with.
+
+ c) Accompany the work with a written offer, valid for at
+ least three years, to give the same user the materials
+ specified in Subsection 6a, above, for a charge no more
+ than the cost of performing this distribution.
+
+ d) If distribution of the work is made by offering access to copy
+ from a designated place, offer equivalent access to copy the above
+ specified materials from the same place.
+
+ e) Verify that the user has already received a copy of these
+ materials or that you have already sent this user a copy.
+
+ For an executable, the required form of the "work that uses the
+Library" must include any data and utility programs needed for
+reproducing the executable from it. However, as a special exception,
+the materials to be distributed need not include anything that is
+normally distributed (in either source or binary form) with the major
+components (compiler, kernel, and so on) of the operating system on
+which the executable runs, unless that component itself accompanies
+the executable.
+
+ It may happen that this requirement contradicts the license
+restrictions of other proprietary libraries that do not normally
+accompany the operating system. Such a contradiction means you cannot
+use both them and the Library together in an executable that you
+distribute.
+\f
+ 7. You may place library facilities that are a work based on the
+Library side-by-side in a single library together with other library
+facilities not covered by this License, and distribute such a combined
+library, provided that the separate distribution of the work based on
+the Library and of the other library facilities is otherwise
+permitted, and provided that you do these two things:
+
+ a) Accompany the combined library with a copy of the same work
+ based on the Library, uncombined with any other library
+ facilities. This must be distributed under the terms of the
+ Sections above.
+
+ b) Give prominent notice with the combined library of the fact
+ that part of it is a work based on the Library, and explaining
+ where to find the accompanying uncombined form of the same work.
+
+ 8. You may not copy, modify, sublicense, link with, or distribute
+the Library except as expressly provided under this License. Any
+attempt otherwise to copy, modify, sublicense, link with, or
+distribute the Library is void, and will automatically terminate your
+rights under this License. However, parties who have received copies,
+or rights, from you under this License will not have their licenses
+terminated so long as such parties remain in full compliance.
+
+ 9. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Library or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Library (or any work based on the
+Library), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Library or works based on it.
+
+ 10. Each time you redistribute the Library (or any work based on the
+Library), the recipient automatically receives a license from the
+original licensor to copy, distribute, link with or modify the Library
+subject to these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties with
+this License.
+\f
+ 11. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Library at all. For example, if a patent
+license would not permit royalty-free redistribution of the Library by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Library.
+
+If any portion of this section is held invalid or unenforceable under any
+particular circumstance, the balance of the section is intended to apply,
+and the section as a whole is intended to apply in other circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 12. If the distribution and/or use of the Library is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Library under this License may add
+an explicit geographical distribution limitation excluding those countries,
+so that distribution is permitted only in or among countries not thus
+excluded. In such case, this License incorporates the limitation as if
+written in the body of this License.
+
+ 13. The Free Software Foundation may publish revised and/or new
+versions of the Lesser General Public License from time to time.
+Such new versions will be similar in spirit to the present version,
+but may differ in detail to address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Library
+specifies a version number of this License which applies to it and
+"any later version", you have the option of following the terms and
+conditions either of that version or of any later version published by
+the Free Software Foundation. If the Library does not specify a
+license version number, you may choose any version ever published by
+the Free Software Foundation.
+\f
+ 14. If you wish to incorporate parts of the Library into other free
+programs whose distribution conditions are incompatible with these,
+write to the author to ask for permission. For software which is
+copyrighted by the Free Software Foundation, write to the Free
+Software Foundation; we sometimes make exceptions for this. Our
+decision will be guided by the two goals of preserving the free status
+of all derivatives of our free software and of promoting the sharing
+and reuse of software generally.
+
+ NO WARRANTY
+
+ 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
+WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
+EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
+OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
+KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
+LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
+THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
+
+ 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
+WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
+AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
+FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
+CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
+LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
+RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
+FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
+SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
+DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+\f
+ How to Apply These Terms to Your New Libraries
+
+ If you develop a new library, and you want it to be of the greatest
+possible use to the public, we recommend making it free software that
+everyone can redistribute and change. You can do so by permitting
+redistribution under these terms (or, alternatively, under the terms of the
+ordinary General Public License).
+
+ To apply these terms, attach the following notices to the library. It is
+safest to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least the
+"copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the library's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+
+Also add information on how to contact you by electronic and paper mail.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the library, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the
+ library `Frob' (a library for tweaking knobs) written by James Random Hacker.
+
+ <signature of Ty Coon>, 1 April 1990
+ Ty Coon, President of Vice
+
+That's all there is to it!
+
+
--- /dev/null
+PROJECT_NAME=theonering
+SOURCE_PATH=src
+SOURCE=$(shell find $(SOURCE_PATH) -iname "*.py")
+PROGRAM=$(SOURCE_PATH)/$(PROJECT_NAME).py
+DATA_TYPES=*.ini *.map *.glade *.png
+DATA=$(foreach type, $(DATA_TYPES), $(shell find $(SOURCE_PATH) -iname "$(type)"))
+OBJ=$(SOURCE:.py=.pyc)
+BUILD_PATH=./build
+TAG_FILE=~/.ctags/$(PROJECT_NAME).tags
+TODO_FILE=./TODO
+
+DEBUGGER=winpdb
+UNIT_TEST=nosetests --with-doctest -w .
+SYNTAX_TEST=support/test_syntax.py
+STYLE_TEST=../../Python/tools/pep8.py --ignore=W191,E501
+LINT_RC=./support/pylint.rc
+LINT=pylint --rcfile=$(LINT_RC)
+PROFILE_GEN=python -m cProfile -o .profile
+PROFILE_VIEW=python -m pstats .profile
+TODO_FINDER=support/todo.py
+CTAGS=ctags-exuberant
+
+.PHONY: all run profile debug test build lint tags todo clean distclean
+
+all: test
+
+run: $(OBJ)
+	$(PROGRAM)
+
+profile: $(OBJ)
+ $(PROFILE_GEN) $(PROGRAM)
+ $(PROFILE_VIEW)
+
+debug: $(OBJ)
+ $(DEBUGGER) $(PROGRAM)
+
+test: $(OBJ)
+ $(UNIT_TEST)
+
+package: $(OBJ)
+ rm -Rf $(BUILD_PATH)
+ mkdir -p $(BUILD_PATH)/generic
+ cp $(SOURCE_PATH)/constants.py $(BUILD_PATH)/generic
+ $(foreach file, $(DATA), cp $(file) $(BUILD_PATH)/generic/$(subst /,-,$(file)) ; )
+ $(foreach file, $(SOURCE), cp $(file) $(BUILD_PATH)/generic/$(subst /,-,$(file)) ; )
+ cp support/$(PROJECT_NAME).manager $(BUILD_PATH)/generic
+ cp support/org.freedesktop.Telepathy.ConnectionManager.$(PROJECT_NAME).service.in $(BUILD_PATH)/generic
+ cp support/icons/*-$(PROJECT_NAME).png $(BUILD_PATH)/generic/
+ cp support/builddeb.py $(BUILD_PATH)/generic
+ cp support/py2deb.py $(BUILD_PATH)/generic
+ cp support/fake_py2deb.py $(BUILD_PATH)/generic
+ mkdir -p $(BUILD_PATH)/chinook
+ cp -R $(BUILD_PATH)/generic/* $(BUILD_PATH)/chinook
+ cd $(BUILD_PATH)/chinook ; python builddeb.py chinook
+ mkdir -p $(BUILD_PATH)/diablo
+ cp -R $(BUILD_PATH)/generic/* $(BUILD_PATH)/diablo
+ cd $(BUILD_PATH)/diablo ; python builddeb.py diablo
+ mkdir -p $(BUILD_PATH)/fremantle
+ cp -R $(BUILD_PATH)/generic/* $(BUILD_PATH)/fremantle
+ cd $(BUILD_PATH)/fremantle ; python builddeb.py fremantle
+ mkdir -p $(BUILD_PATH)/mer
+ cp -R $(BUILD_PATH)/generic/* $(BUILD_PATH)/mer
+ cd $(BUILD_PATH)/mer ; python builddeb.py mer
+
+lint: $(OBJ)
+ $(foreach file, $(SOURCE), $(LINT) $(file) ; )
+
+tags: $(TAG_FILE)
+
+todo: $(TODO_FILE)
+
+clean:
+ rm -Rf $(OBJ)
+ rm -Rf $(BUILD_PATH)
+ rm -Rf $(TODO_FILE)
+
+distclean:
+ rm -Rf $(OBJ)
+ rm -Rf $(BUILD_PATH)
+ rm -Rf $(TAG_FILE)
+ find $(SOURCE_PATH) -name "*.*~" | xargs rm -f
+ find $(SOURCE_PATH) -name "*.swp" | xargs rm -f
+ find $(SOURCE_PATH) -name "*.bak" | xargs rm -f
+ find $(SOURCE_PATH) -name ".*.swp" | xargs rm -f
+
+$(TAG_FILE): $(OBJ)
+ mkdir -p $(dir $(TAG_FILE))
+ $(CTAGS) -o $(TAG_FILE) $(SOURCE)
+
+$(TODO_FILE): $(SOURCE)
+ @- $(TODO_FINDER) $(SOURCE) > $(TODO_FILE)
+
+%.pyc: %.py
+ $(SYNTAX_TEST) $<
+
+#Makefile debugging
+#Target to print any variable; can be added to the dependencies of any other target
+#Useful flags for make: -d, -p, -n
+print-%: ; @$(error $* is $($*) ($(value $*)) (from $(origin $*)))
--- /dev/null
+#!/usr/bin/env python
--- /dev/null
+#!/usr/bin/env python
+
+import contact_list
+import text
+import call
--- /dev/null
+import logging
+
+import telepathy
+
+
+_moduleLogger = logging.getLogger("channel.call")
+
+
+class CallChannel(
+ telepathy.server.ChannelTypeStreamedMedia,
+ telepathy.server.ChannelInterfaceCallState,
+ ):
+
+	def __init__(self, connection):
+		telepathy.server.ChannelTypeStreamedMedia.__init__(self, connection, None)
+		telepathy.server.ChannelInterfaceCallState.__init__(self)
+
+ def ListStreams(self):
+ """
+ For org.freedesktop.Telepathy.Channel.Type.StreamedMedia
+ """
+ return ()
+
+ def RemoveStreams(self, streams):
+ """
+ For org.freedesktop.Telepathy.Channel.Type.StreamedMedia
+ """
+ raise telepathy.NotImplemented("Cannot remove a stream")
+
+ def RequestStreamDirection(self, stream, streamDirection):
+ """
+ For org.freedesktop.Telepathy.Channel.Type.StreamedMedia
+
+		@note Since streams are short-lived, this is deliberately left unimplemented
+ """
+ _moduleLogger.info("A request was made to change the stream direction")
+ raise telepathy.NotImplemented("Cannot change directions")
+
+ def RequestStreams(self, contact, streamTypes):
+ """
+ For org.freedesktop.Telepathy.Channel.Type.StreamedMedia
+
+ @returns [(Stream ID, contact, stream type, stream state, stream direction, pending send flags)]
+ """
+ for streamType in streamTypes:
+ if streamType != telepathy.constants.MEDIA_STREAM_TYPE_AUDIO:
+ raise telepathy.NotImplemented("Audio is the only stream type supported")
+
+ contactId = contact.name
+
+		addressbook = self._conn.session.addressbook
+		phones = addressbook.get_contact_details(contactId)
+		# get_contact_details yields (phoneType, phoneNumber) pairs
+		phoneType, firstNumber = phones.next()
+		self._conn.session.backend.dial(firstNumber)
+
+ streamId = 0
+ streamState = telepathy.constants.MEDIA_STREAM_STATE_DISCONNECTED
+ streamDirection = telepathy.constants.MEDIA_STREAM_DIRECTION_BIDIRECTIONAL
+ pendingSendFlags = telepathy.constants.MEDIA_STREAM_PENDING_REMOTE_SEND
+ return [(streamId, contact, streamTypes[0], streamState, streamDirection, pendingSendFlags)]
+
+ def GetCallStates(self):
+ """
+ For org.freedesktop.Telepathy.Channel.Interface.CallState
+
+ Get the current call states for all contacts involved in this call.
+ @returns {Contact: telepathy.constants.CHANNEL_CALL_STATE_*}
+ """
+ return {}
--- /dev/null
+import logging
+
+import telepathy
+
+import util.go_utils as gobject_utils
+import util.coroutines as coroutines
+import handle
+
+
+_moduleLogger = logging.getLogger("channel.contact_list")
+
+
+class AbstractListChannel(
+ telepathy.server.ChannelTypeContactList,
+ telepathy.server.ChannelInterfaceGroup,
+ ):
+ "Abstract Contact List channels"
+
+ def __init__(self, connection, h):
+ telepathy.server.ChannelTypeContactList.__init__(self, connection, h)
+ telepathy.server.ChannelInterfaceGroup.__init__(self)
+
+ self._session = connection.session
+
+
+class AllContactsListChannel(AbstractListChannel):
+ """
+ The group of contacts for whom you receive presence
+ """
+
+ def __init__(self, connection, h):
+ AbstractListChannel.__init__(self, connection, h)
+ self._session.addressbook.updateSignalHandler.register_sink(
+ self._on_contacts_refreshed
+ )
+ self.GroupFlagsChanged(0, 0)
+
+ addressbook = connection.session.addressbook
+ contacts = addressbook.get_contacts()
+ self._process_refresh(addressbook, contacts, [])
+
+ @coroutines.func_sink
+ @coroutines.expand_positional
+ @gobject_utils.async
+ def _on_contacts_refreshed(self, addressbook, added, removed, changed):
+ """
+ @todo This currently filters out people not yet added to the contact
+ list. Something needs to be done about those
+ @todo This currently does not handle people with multiple phone
+ numbers, yay that'll be annoying to resolve
+ """
+ self._process_refresh(addressbook, added, removed)
+
+ def _process_refresh(self, addressbook, added, removed):
+ connection = self._conn
+ handlesAdded = [
+ handle.create_handle(connection, "contact", contactId, phoneNumber)
+ for contactId in added
+ if contactId
+ for (phoneType, phoneNumber) in addressbook.get_contact_details(contactId)
+ ]
+ handlesRemoved = [
+ handle.create_handle(connection, "contact", contactId, phoneNumber)
+ for contactId in removed
+ if contactId
+ for (phoneType, phoneNumber) in addressbook.get_contact_details(contactId)
+ ]
+ message = ""
+ actor = 0
+ reason = telepathy.CHANNEL_GROUP_CHANGE_REASON_NONE
+ self.MembersChanged(
+ message,
+ handlesAdded, handlesRemoved,
+ (), (),
+ actor,
+ reason,
+ )
+
+
+def create_contact_list_channel(connection, h):
+ if h.get_name() == 'subscribe':
+ # The group of contacts for whom you receive presence
+ ChannelClass = AllContactsListChannel
+ elif h.get_name() == 'publish':
+ # The group of contacts who may receive your presence
+ ChannelClass = AllContactsListChannel
+ elif h.get_name() == 'hide':
+ # A group of contacts who are on the publish list but are temporarily
+ # disallowed from receiving your presence
+ # This doesn't make sense to support
+ _moduleLogger.warn("Unsupported type %s" % h.get_name())
+ raise telepathy.NotImplemented("Unsupported list type %s" % h.get_name())
+ elif h.get_name() == 'allow':
+ # A group of contacts who may send you messages
+ # @todo Allow-List would be cool to support
+ _moduleLogger.warn("Unsupported type %s" % h.get_name())
+ raise telepathy.NotImplemented("Unsupported list type %s" % h.get_name())
+ elif h.get_name() == 'deny':
+ # A group of contacts who may not send you messages
+ # @todo Deny-List would be cool to support
+ _moduleLogger.warn("Unsupported type %s" % h.get_name())
+ raise telepathy.NotImplemented("Unsupported list type %s" % h.get_name())
+ elif h.get_name() == 'stored':
+ # On protocols where the user's contacts are stored, this contact list
+ # contains all stored contacts regardless of subscription status.
+ ChannelClass = AllContactsListChannel
+ else:
+ raise TypeError("Unknown list type: " + h.get_name())
+ return ChannelClass(connection, h)
+
+
--- /dev/null
+import time
+import logging
+
+import telepathy
+
+import handle
+
+
+_moduleLogger = logging.getLogger("channel.text")
+
+
+class TextChannel(telepathy.server.ChannelTypeText):
+ """
+ Look into implementing ChannelInterfaceMessages for rich text formatting
+ """
+
+ def __init__(self, connection, h):
+ telepathy.server.ChannelTypeText.__init__(self, connection, h)
+ self._nextReceivedId = 0
+
+ handles = []
+ # @todo Populate participants
+ self.MembersChanged('', handles, [], [], [],
+ 0, telepathy.CHANNEL_GROUP_CHANGE_REASON_NONE)
+
+ def Send(self, messageType, text):
+ if messageType != telepathy.CHANNEL_TEXT_MESSAGE_TYPE_NORMAL:
+ raise telepathy.NotImplemented("Unhandled message type")
+ # @todo implement sending message
+ self.Sent(int(time.time()), messageType, text)
+
+ def Close(self):
+ telepathy.server.ChannelTypeText.Close(self)
+ self.remove_from_connection()
+
+ def _on_message_received(self, contactId, contactNumber, message):
+ """
+ @todo Attach this to receiving a message
+ """
+ currentReceivedId = self._nextReceivedId
+
+ timestamp = int(time.time())
+ h = handle.create_handle(self._conn, "contact", contactId, contactNumber)
+ messageType = telepathy.CHANNEL_TEXT_MESSAGE_TYPE_NORMAL
+ messageText = message.content
+
+ _moduleLogger.info("Received message from User %r" % h)
+ self.Received(currentReceivedId, timestamp, h, messageType, 0, messageText)
+
+ self._nextReceivedId += 1
--- /dev/null
+import weakref
+import logging
+
+import telepathy
+
+import channel
+
+
+_moduleLogger = logging.getLogger("channel_manager")
+
+
+class ChannelManager(object):
+
+ def __init__(self, connection):
+ self._connRef = weakref.ref(connection)
+ self._listChannels = weakref.WeakValueDictionary()
+ self._textChannels = weakref.WeakValueDictionary()
+ self._callChannels = weakref.WeakValueDictionary()
+
+ def close(self):
+ for chan in self._listChannels.values():
+ chan.remove_from_connection() # so that dbus lets it die
+ for chan in self._textChannels.values():
+ chan.Close()
+ for chan in self._callChannels.values():
+ chan.Close()
+
+ def channel_for_list(self, handle, suppress_handler=False):
+ try:
+ chan = self._listChannels[handle]
+ except KeyError, e:
+ if handle.get_type() != telepathy.HANDLE_TYPE_LIST:
+ raise telepathy.NotImplemented("Only server lists are allowed")
+ _moduleLogger.debug("Requesting new contact list channel")
+
+ chan = channel.contact_list.create_contact_list_channel(self._connRef(), handle)
+ self._listChannels[handle] = chan
+ self._connRef().add_channel(chan, handle, suppress_handler)
+ return chan
+
+ def channel_for_text(self, handle, suppress_handler=False):
+ try:
+ chan = self._textChannels[handle]
+ except KeyError, e:
+ if handle.get_type() != telepathy.HANDLE_TYPE_CONTACT:
+ raise telepathy.NotImplemented("Only Contacts are allowed")
+ _moduleLogger.debug("Requesting new text channel")
+
+ chan = channel.text.TextChannel(self._connRef(), None)
+ self._textChannels[handle] = chan
+ self._connRef().add_channel(chan, handle, suppress_handler)
+ return chan
+
+ def channel_for_call(self, handle, suppress_handler=False):
+ try:
+ chan = self._callChannels[handle]
+ except KeyError, e:
+ if handle.get_type() != telepathy.HANDLE_TYPE_NONE:
+ raise telepathy.NotImplemented("Using deprecated means to create a call")
+ _moduleLogger.debug("Requesting new call channel")
+
+ chan = channel.call.CallChannel(self._connRef())
+ self._callChannels[handle] = chan
+ self._connRef().add_channel(chan, handle, suppress_handler)
+ return chan
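The get-or-create lookups above rely on `WeakValueDictionary`, so a channel stays cached only while something else holds a strong reference to it. A minimal self-contained sketch of that pattern (the `Channel`/`ChannelCache` names are hypothetical, not part of this codebase):

```python
import weakref


class Channel(object):
    """Hypothetical stand-in for a telepathy channel; only needs to be weak-referenceable."""

    def __init__(self, handle):
        self.handle = handle


class ChannelCache(object):

    def __init__(self):
        # Values are only weakly held: a closed or dropped channel
        # falls out of the cache automatically, no explicit eviction.
        self._channels = weakref.WeakValueDictionary()

    def channel_for(self, handle):
        try:
            return self._channels[handle]
        except KeyError:
            chan = Channel(handle)
            self._channels[handle] = chan
            return chan
```

While a caller holds the channel, repeated lookups return the same object; once all strong references are gone, the next lookup builds a fresh one.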
--- /dev/null
+import weakref
+import logging
+
+import telepathy
+
+import constants
+import gvoice
+import handle
+import channel_manager
+
+
+_moduleLogger = logging.getLogger("connection")
+
+
+class TheOneRingConnection(telepathy.server.Connection):
+
+ # Overriding a base class variable
+ _mandatory_parameters = {
+ 'username' : 's',
+ 'password' : 's',
+ 'forward' : 's',
+ }
+ # Overriding a base class variable
+ _optional_parameters = {
+ }
+ _parameter_defaults = {
+ }
+
+ def __init__(self, manager, parameters):
+ try:
+ self.check_parameters(parameters)
+ account = unicode(parameters['username'])
+
+ telepathy.server.Connection.__init__(
+ self,
+ constants._telepathy_protocol_name_,
+ account,
+ constants._telepathy_implementation_name_
+ )
+
+ self._manager = weakref.proxy(manager)
+ self._credentials = (
+ parameters['username'].encode('utf-8'),
+ parameters['password'].encode('utf-8'),
+ )
+ self._callbackNumber = parameters['forward'].encode('utf-8')
+ self._channelManager = channel_manager.ChannelManager(self)
+
+ cookieFilePath = "%s/cookies.txt" % constants._data_path_
+ self._session = gvoice.session.Session(cookieFilePath)
+
+ self.set_self_handle(handle.create_handle(self, 'connection'))
+
+ _moduleLogger.info("Connection to the account %s created" % account)
+ except Exception, e:
+ _moduleLogger.exception("Failed to create Connection")
+ raise
+
+ @property
+ def manager(self):
+ return self._manager
+
+ @property
+ def session(self):
+ return self._session
+
+ @property
+ def username(self):
+ return self._credentials[0]
+
+ def handle(self, handleType, handleId):
+ self.check_handle(handleType, handleId)
+ return self._handles[handleType, handleId]
+
+ def Connect(self):
+ """
+ For org.freedesktop.telepathy.Connection
+ """
+ _moduleLogger.info("Connecting...")
+ self.StatusChanged(
+ telepathy.CONNECTION_STATUS_CONNECTING,
+ telepathy.CONNECTION_STATUS_REASON_REQUESTED
+ )
+ try:
+ self.session.login(*self._credentials)
+ self.session.backend.set_callback_number(self._callbackNumber)
+ except gvoice.backend.NetworkError, e:
+ _moduleLogger.exception("Connection Failed")
+ self.StatusChanged(
+ telepathy.CONNECTION_STATUS_DISCONNECTED,
+ telepathy.CONNECTION_STATUS_REASON_NETWORK_ERROR
+ )
+ except Exception, e:
+ _moduleLogger.exception("Connection Failed")
+ self.StatusChanged(
+ telepathy.CONNECTION_STATUS_DISCONNECTED,
+ telepathy.CONNECTION_STATUS_REASON_AUTHENTICATION_FAILED
+ )
+ else:
+ _moduleLogger.info("Connected")
+ self.StatusChanged(
+ telepathy.CONNECTION_STATUS_CONNECTED,
+ telepathy.CONNECTION_STATUS_REASON_REQUESTED
+ )
+
+ def Disconnect(self):
+ """
+ For org.freedesktop.telepathy.Connection
+ @bug Not properly logging out. Cookie files need to be per connection and removed
+ """
+ _moduleLogger.info("Disconnecting")
+ try:
+ self.session.logout()
+ _moduleLogger.info("Disconnected")
+ except Exception:
+ _moduleLogger.exception("Disconnecting Failed")
+ self.StatusChanged(
+ telepathy.CONNECTION_STATUS_DISCONNECTED,
+ telepathy.CONNECTION_STATUS_REASON_REQUESTED
+ )
+
+ def RequestChannel(self, type, handleType, handleId, suppressHandler):
+ """
+ For org.freedesktop.telepathy.Connection
+
+ @param type DBus interface name for base channel type
+ @param handleId represents a contact, list, etc according to handleType
+
+ @returns DBus object path for the channel created or retrieved
+ """
+ self.check_connected()
+ self.check_handle(handleType, handleId)
+
+ channel = None
+ channelManager = self._channelManager
+ handle = self.handle(handleType, handleId)
+
+ if type == telepathy.CHANNEL_TYPE_CONTACT_LIST:
+ _moduleLogger.info("RequestChannel ContactList")
+ channel = channelManager.channel_for_list(handle, suppressHandler)
+ elif type == telepathy.CHANNEL_TYPE_TEXT:
+ _moduleLogger.info("RequestChannel Text")
+ channel = channelManager.channel_for_text(handle, suppressHandler)
+ elif type == telepathy.CHANNEL_TYPE_STREAMED_MEDIA:
+ _moduleLogger.info("RequestChannel Media")
+ channel = channelManager.channel_for_call(handle, suppressHandler)
+ else:
+ raise telepathy.NotImplemented("unknown channel type %s" % type)
+
+ _moduleLogger.info("RequestChannel Object Path: %s" % channel._object_path)
+ return channel._object_path
+
+ def RequestHandles(self, handleType, names, sender):
+ """
+ For org.freedesktop.telepathy.Connection
+ Overriding telepathy.server.Connection to allow custom handles
+ """
+ self.check_connected()
+ self.check_handle_type(handleType)
+
+ handles = []
+ for name in names:
+ name = name.encode('utf-8')
+ if handleType == telepathy.HANDLE_TYPE_CONTACT:
+ _moduleLogger.info("RequestHandles Contact: %s" % name)
+ h = self._create_contact_handle(name)
+ elif handleType == telepathy.HANDLE_TYPE_LIST:
+ # Support only server side (immutable) lists
+ _moduleLogger.info("RequestHandles List: %s" % name)
+ h = handle.create_handle(self, 'list', name)
+ else:
+ raise telepathy.NotAvailable('Handle type unsupported %d' % handleType)
+ handles.append(h.id)
+ self.add_client_handle(h, sender)
+ return handles
+
+ def _create_contact_handle(self, requestedHandleName):
+ """
+ @todo Determine if any of this is really needed
+ """
+ requestedContactId, requestedContactName = handle.ContactHandle.from_handle_name(
+ requestedHandleName
+ )
+ h = handle.create_handle(self, 'contact', requestedContactId, requestedHandleName)
+ return h
+
+ def _on_invite_text(self, contactId):
+ """
+ @todo Make this work
+ """
+ h = self._create_contact_handle(contactId)
+
+ channelManager = self._channelManager
+ channel = channelManager.channel_for_text(h)
--- /dev/null
+import logging
+
+import gobject
+import telepathy
+
+import constants
+import connection
+
+
+_moduleLogger = logging.getLogger("connection_manager")
+
+
+class TheOneRingConnectionManager(telepathy.server.ConnectionManager):
+
+ def __init__(self, shutdown_func=None):
+ telepathy.server.ConnectionManager.__init__(self, constants._telepathy_implementation_name_)
+
+ # self._protos is from super
+ self._protos[constants._telepathy_protocol_name_] = connection.TheOneRingConnection
+ self._on_shutdown = shutdown_func
+ _moduleLogger.info("Connection manager created")
+
+ def GetParameters(self, proto):
+ """
+ For org.freedesktop.telepathy.ConnectionManager
+
+ @returns the mandatory and optional parameters for creating a connection
+ """
+ if proto not in self._protos:
+ raise telepathy.NotImplemented('unknown protocol %s' % proto)
+
+ result = []
+ ConnectionClass = self._protos[proto]
+ mandatoryParameters = ConnectionClass._mandatory_parameters
+ optionalParameters = ConnectionClass._optional_parameters
+ defaultParameters = ConnectionClass._parameter_defaults
+
+ for parameterName, parameterType in mandatoryParameters.iteritems():
+ flags = telepathy.CONN_MGR_PARAM_FLAG_REQUIRED
+ if parameterName == "password":
+ flags |= telepathy.CONN_MGR_PARAM_FLAG_SECRET
+ param = (
+ parameterName,
+ flags,
+ parameterType,
+ '',
+ )
+ result.append(param)
+
+ for parameterName, parameterType in optionalParameters.iteritems():
+ if parameterName in defaultParameters:
+ flags = telepathy.CONN_MGR_PARAM_FLAG_HAS_DEFAULT
+ if parameterName == "password":
+ flags |= telepathy.CONN_MGR_PARAM_FLAG_SECRET
+ default = defaultParameters[parameterName]
+ else:
+ flags = 0
+ default = ""
+ param = (
+ parameterName,
+ flags,
+ parameterType,
+ default,
+ )
+ result.append(param)
+
+ return result
+
+ def disconnected(self, conn):
+ """
+ Overrides telepathy.server.ConnectionManager
+ """
+ result = telepathy.server.ConnectionManager.disconnected(self, conn)
+ gobject.timeout_add(5000, self._shutdown)
+ return result
+
+ def quit(self):
+ """
+ Terminates all connections. Must be called upon quit
+ """
+ for connection in self._connections:
+ connection.Disconnect()
+ _moduleLogger.info("Connection manager quitting")
+
+ def _shutdown(self):
+ if (
+ self._on_shutdown is not None and
+ len(self._connections) == 0
+ ):
+ self._on_shutdown()
+ return False
--- /dev/null
+import os
+
+__pretty_app_name__ = "Telepathy-TheOneRing"
+__app_name__ = "telepathy-theonering"
+__version__ = "0.1.0"
+__build__ = 0
+__app_magic__ = 0xdeadbeef
+_data_path_ = os.path.join(os.path.expanduser("~"), ".telepathy-theonering")
+_user_settings_ = "%s/settings.ini" % _data_path_
+_telepathy_protocol_name_ = "sip"
+_telepathy_implementation_name_ = "theonering"
--- /dev/null
+#!/usr/bin/python
+
+from __future__ import with_statement
+
+import os
+import errno
+import time
+import functools
+import contextlib
+import logging
+import threading
+import Queue
+
+
+_moduleLogger = logging.getLogger("gtk_toolbox")
+
+
+@contextlib.contextmanager
+def flock(path, timeout=-1):
+ WAIT_FOREVER = -1
+ DELAY = 0.1
+ timeSpent = 0
+
+ acquired = False
+
+ while timeSpent <= timeout or timeout == WAIT_FOREVER:
+ try:
+ fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_RDWR)
+ acquired = True
+ break
+ except OSError, e:
+ if e.errno != errno.EEXIST:
+ raise
+ time.sleep(DELAY)
+ timeSpent += DELAY
+
+ assert acquired, "Failed to grab file-lock %s within timeout %d" % (path, timeout)
+
+ try:
+ yield fd
+ finally:
+ os.close(fd)
+ os.unlink(path)
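The lock above works because `O_CREAT | O_EXCL` makes creating the lock file an atomic test-and-set: the file's existence is the lock. A self-contained sketch of the same idea (restated with the Python 3 `except ... as` spelling, and with the fd closed on release, which the original omits):

```python
import contextlib
import errno
import os
import tempfile
import time


@contextlib.contextmanager
def file_lock(path, timeout=-1):
    # O_CREAT | O_EXCL fails with EEXIST if the file already exists,
    # so creation doubles as an atomic lock acquisition.
    WAIT_FOREVER = -1
    DELAY = 0.1
    timeSpent = 0
    acquired = False
    while timeSpent <= timeout or timeout == WAIT_FOREVER:
        try:
            fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_RDWR)
            acquired = True
            break
        except OSError as e:
            if e.errno != errno.EEXIST:
                raise
            time.sleep(DELAY)
            timeSpent += DELAY
    assert acquired, "Failed to grab file-lock %s within timeout %d" % (path, timeout)
    try:
        yield fd
    finally:
        os.close(fd)
        os.unlink(path)


lockPath = os.path.join(tempfile.mkdtemp(), "demo.lock")
with file_lock(lockPath):
    heldDuring = os.path.exists(lockPath)
heldAfter = os.path.exists(lockPath)
```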
+
+
+def make_idler(func):
+ """
+ Decorator that makes a generator-function into a function that will continue execution on next call
+ """
+ a = []
+
+ @functools.wraps(func)
+ def decorated_func(*args, **kwds):
+ if not a:
+ a.append(func(*args, **kwds))
+ try:
+ a[0].next()
+ return True
+ except StopIteration:
+ del a[:]
+ return False
+
+ return decorated_func
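A quick usage sketch of the idler decorator above (restated with the `next()` builtin so the snippet runs standalone): each call resumes the generator one step and reports whether more work remains.

```python
import functools


def make_idler(func):
    # Restated from above: one generator instance is kept between calls;
    # each call advances it one step, returning True while work remains.
    a = []

    @functools.wraps(func)
    def decorated_func(*args, **kwds):
        if not a:
            a.append(func(*args, **kwds))
        try:
            next(a[0])
            return True
        except StopIteration:
            del a[:]
            return False

    return decorated_func


@make_idler
def three_step_task():
    yield  # step 1
    yield  # step 2
    yield  # step 3


# Three calls do work, the fourth reports completion
results = [three_step_task() for _ in range(4)]
```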
+
+
+def autostart(func):
+ """
+ >>> @autostart
+ ... def grep_sink(pattern):
+ ... print "Looking for %s" % pattern
+ ... while True:
+ ... line = yield
+ ... if pattern in line:
+ ... print line,
+ >>> g = grep_sink("python")
+ Looking for python
+ >>> g.send("Yeah but no but yeah but no")
+ >>> g.send("A series of tubes")
+ >>> g.send("python generators rock!")
+ python generators rock!
+ >>> g.close()
+ """
+
+ @functools.wraps(func)
+ def start(*args, **kwargs):
+ cr = func(*args, **kwargs)
+ cr.next()
+ return cr
+
+ return start
+
+
+@autostart
+def printer_sink(format = "%s"):
+ """
+ >>> pr = printer_sink("%r")
+ >>> pr.send("Hello")
+ 'Hello'
+ >>> pr.send("5")
+ '5'
+ >>> pr.send(5)
+ 5
+ >>> p = printer_sink()
+ >>> p.send("Hello")
+ Hello
+ >>> p.send("World")
+ World
+ >>> # p.throw(RuntimeError, "Goodbye")
+ >>> # p.send("Meh")
+ >>> # p.close()
+ """
+ while True:
+ item = yield
+ print format % (item, )
+
+
+@autostart
+def null_sink():
+ """
+ Good for uses like with cochain to pick up any slack
+ """
+ while True:
+ item = yield
+
+
+@autostart
+def comap(function, target):
+ """
+ >>> p = printer_sink()
+ >>> cm = comap(lambda x: x+1, p)
+ >>> cm.send((0, ))
+ 1
+ >>> cm.send((1.0, ))
+ 2.0
+ >>> cm.send((-2, ))
+ -1
+ """
+ while True:
+ try:
+ item = yield
+ mappedItem = function(*item)
+ target.send(mappedItem)
+ except Exception, e:
+ _moduleLogger.exception("Forwarding exception!")
+ target.throw(e.__class__, str(e))
+
+
+def _flush_queue(queue):
+ while not queue.empty():
+ yield queue.get()
+
+
+@autostart
+def queue_sink(queue):
+ """
+ >>> q = Queue.Queue()
+ >>> qs = queue_sink(q)
+ >>> qs.send("Hello")
+ >>> qs.send("World")
+ >>> qs.throw(RuntimeError, "Goodbye")
+ >>> qs.send("Meh")
+ >>> qs.close()
+ >>> print [i for i in _flush_queue(q)]
+ [(None, 'Hello'), (None, 'World'), (<type 'exceptions.RuntimeError'>, 'Goodbye'), (None, 'Meh'), (<type 'exceptions.GeneratorExit'>, None)]
+ """
+ while True:
+ try:
+ item = yield
+ queue.put((None, item))
+ except Exception, e:
+ queue.put((e.__class__, str(e)))
+ except GeneratorExit:
+ queue.put((GeneratorExit, None))
+ raise
+
+
+def decode_item(item, target):
+ if item[0] is None:
+ target.send(item[1])
+ return False
+ elif item[0] is GeneratorExit:
+ target.close()
+ return True
+ else:
+ target.throw(item[0], item[1])
+ return False
+
+
+def nonqueue_source(queue, target):
+ isDone = False
+ while not isDone:
+ item = queue.get()
+ isDone = decode_item(item, target)
+ while not queue.empty():
+ queue.get_nowait()
+
+
+def threaded_stage(target, thread_factory = threading.Thread):
+ messages = Queue.Queue()
+
+ run_source = functools.partial(nonqueue_source, messages, target)
+ thread = thread_factory(target=run_source)
+ thread.setDaemon(True)
+ thread.start()
+
+ # Sink running in current thread
+ return queue_sink(messages)
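The queue stages above move `(exception class, payload)` envelopes between threads, and `decode_item` replays each envelope into the target coroutine. The envelope protocol can be exercised without any threads; the `RecordingTarget` class here is a hypothetical stand-in for a real coroutine target:

```python
class RecordingTarget(object):
    """Hypothetical stand-in for a coroutine target; records what it sees."""

    def __init__(self):
        self.received = []
        self.closed = False

    def send(self, item):
        self.received.append(item)

    def throw(self, cls, value):
        self.received.append((cls, value))

    def close(self):
        self.closed = True


def decode_item(item, target):
    # Restated from above: unpack an (exception class, payload) envelope
    # and replay it into the target; returns True once the stream is done.
    if item[0] is None:
        target.send(item[1])
        return False
    elif item[0] is GeneratorExit:
        target.close()
        return True
    else:
        target.throw(item[0], item[1])
        return False


target = RecordingTarget()
done = [
    decode_item((None, "hello"), target),
    decode_item((RuntimeError, "boom"), target),
    decode_item((GeneratorExit, None), target),
]
```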
+
+
+def safecall(f, errorDisplay=None, default=None, exception=Exception):
+ '''
+ Returns modified f. When the modified f is called and throws an
+ exception, the default value is returned
+ '''
+ def _safecall(*args, **argv):
+ try:
+ return f(*args,**argv)
+ except exception, e:
+ if errorDisplay is not None:
+ errorDisplay.push_exception(e)
+ return default
+ return _safecall
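A usage sketch of `safecall` (restated with the Python 3 `except ... as` spelling so the snippet runs standalone): wrap a fallible callable so failures collapse to a default instead of propagating.

```python
def safecall(f, errorDisplay=None, default=None, exception=Exception):
    # Restated from above: returns a wrapper that swallows the given
    # exception type and substitutes a default value.
    def _safecall(*args, **argv):
        try:
            return f(*args, **argv)
        except exception as e:
            if errorDisplay is not None:
                errorDisplay.push_exception(e)
            return default
    return _safecall


# int() raises ValueError on bad input; the wrapped version returns 0 instead
safe_int = safecall(int, default=0)
```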
--- /dev/null
+#!/usr/bin/python
+
+import backend
+import addressbook
+import session
--- /dev/null
+#!/usr/bin/python
+
+
+import logging
+
+import util.coroutines as coroutines
+
+
+_moduleLogger = logging.getLogger("gvoice.addressbook")
+
+
+class Addressbook(object):
+
+ def __init__(self, backend):
+ self._backend = backend
+ self._contacts = {}
+ self._addedContacts = set()
+ self._removedContacts = set()
+ self._changedContacts = set()
+
+ self.updateSignalHandler = coroutines.CoTee()
+ self.update()
+
+ def update(self, force=False):
+ if not force and self._contacts:
+ return
+ oldContacts = self._contacts
+ oldContactIds = set(self.get_contacts())
+
+ self._contacts = {}
+ self._populate_contacts()
+ newContactIds = set(self.get_contacts())
+
+ self._addedContacts = newContactIds - oldContactIds
+ self._removedContacts = oldContactIds - newContactIds
+ self._changedContacts = set(
+ contactId
+ for contactId in newContactIds.intersection(oldContactIds)
+ if self._has_contact_changed(contactId, oldContacts)
+ )
+
+ if self._addedContacts or self._removedContacts or self._changedContacts:
+ message = self, self._addedContacts, self._removedContacts, self._changedContacts
+ self.updateSignalHandler.stage.send(message)
+
+ def get_contacts(self):
+ return self._contacts.iterkeys()
+
+ def get_contact_name(self, contactId):
+ return self._contacts[contactId][0]
+
+ def get_contact_details(self, contactId):
+ self._populate_contact_details(contactId)
+ return self._get_contact_details(contactId)
+
+ def _populate_contacts(self):
+ if self._contacts:
+ return
+ contacts = self._backend.get_contacts()
+ for contactId, contactName in contacts:
+ self._contacts[contactId] = (contactName, [])
+
+ def _populate_contact_details(self, contactId):
+ if self._get_contact_details(contactId):
+ return
+ self._get_contact_details(contactId).extend(
+ self._backend.get_contact_details(contactId)
+ )
+
+ def _get_contact_details(self, contactId):
+ return self._contacts[contactId][1]
+
+ def _has_contact_changed(self, contactId, oldContacts):
+ oldContact = oldContacts[contactId]
+ oldContactName = oldContact[0]
+ oldContactDetails = oldContact[1]
+ if oldContactName != self.get_contact_name(contactId):
+ return True
+ if not oldContactDetails:
+ return False
+ # if it's already in the old cache, purposefully pull it into the new cache too
+ return oldContactDetails != self.get_contact_details(contactId)
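The bookkeeping in `update()` above is plain set arithmetic over contact ids. A standalone sketch of the same added/removed/changed computation, using a hypothetical `diff_contacts` helper over `contactId -> (name, details)` mappings:

```python
def diff_contacts(oldContacts, newContacts):
    # oldContacts/newContacts map contactId -> (name, details)
    oldIds = set(oldContacts)
    newIds = set(newContacts)
    added = newIds - oldIds
    removed = oldIds - newIds
    # A contact counts as changed if it survives the refresh but its
    # (name, details) pair differs from the cached copy.
    changed = set(
        contactId
        for contactId in newIds & oldIds
        if oldContacts[contactId] != newContacts[contactId]
    )
    return added, removed, changed


added, removed, changed = diff_contacts(
    {"1": ("Alice", []), "2": ("Bob", [])},
    {"2": ("Bobby", []), "3": ("Carol", [])},
)
```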
--- /dev/null
+#!/usr/bin/python
+
+"""
+DialCentral - Front end for Google's GoogleVoice service.
+Copyright (C) 2008 Eric Warnke ericew AT gmail DOT com
+
+This library is free software; you can redistribute it and/or
+modify it under the terms of the GNU Lesser General Public
+License as published by the Free Software Foundation; either
+version 2.1 of the License, or (at your option) any later version.
+
+This library is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+Lesser General Public License for more details.
+
+You should have received a copy of the GNU Lesser General Public
+License along with this library; if not, write to the Free Software
+Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+
+Google Voice backend code
+
+Resources
+ http://thatsmith.com/2009/03/google-voice-addon-for-firefox/
+ http://posttopic.com/topic/google-voice-add-on-development
+"""
+
+
+import os
+import re
+import urllib
+import urllib2
+import time
+import datetime
+import itertools
+import logging
+from xml.sax import saxutils
+
+from xml.etree import ElementTree
+
+import browser_emu
+
+try:
+ import simplejson
+except ImportError:
+ simplejson = None
+
+
+_moduleLogger = logging.getLogger("gvoice.backend")
+_TRUE_REGEX = re.compile("true")
+_FALSE_REGEX = re.compile("false")
+
+
+def safe_eval(s):
+ s = _TRUE_REGEX.sub("True", s)
+ s = _FALSE_REGEX.sub("False", s)
+ return eval(s, {}, {})
+
+
+if simplejson is None:
+ def parse_json(flattened):
+ return safe_eval(flattened)
+else:
+ def parse_json(flattened):
+ return simplejson.loads(flattened)
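As a quick illustration of the fallback path above, `safe_eval` rewrites the JSON literals `true`/`false` into their Python spellings and evaluates the result in an empty namespace. Note that the naive substitution would also rewrite `true`/`false` occurring inside string values, and `null` is not handled; the real responses this parser targets avoid both cases.

```python
import re

_TRUE_REGEX = re.compile("true")
_FALSE_REGEX = re.compile("false")


def safe_eval(s):
    # Restated from above: translate JSON literals into Python ones,
    # then evaluate with empty globals/locals as a simplejson stand-in.
    s = _TRUE_REGEX.sub("True", s)
    s = _FALSE_REGEX.sub("False", s)
    return eval(s, {}, {})


parsed = safe_eval('{"registered": true, "numbers": [1, 2, 3]}')
```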
+
+
+def itergroup(iterator, count, padValue = None):
+ """
+ Iterate in groups of 'count' values. If there
+ aren't enough values, the last result is padded with
+ padValue (None by default).
+
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6], 3):
+ ... print tuple(val)
+ (1, 2, 3)
+ (4, 5, 6)
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6], 3):
+ ... print list(val)
+ [1, 2, 3]
+ [4, 5, 6]
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6, 7], 3):
+ ... print tuple(val)
+ (1, 2, 3)
+ (4, 5, 6)
+ (7, None, None)
+ >>> for val in itergroup("123456", 3):
+ ... print tuple(val)
+ ('1', '2', '3')
+ ('4', '5', '6')
+ >>> for val in itergroup("123456", 3):
+ ... print repr("".join(val))
+ '123'
+ '456'
+ """
+ paddedIterator = itertools.chain(iterator, itertools.repeat(padValue, count-1))
+ nIterators = (paddedIterator, ) * count
+ return itertools.izip(*nIterators)
+
+
+class NetworkError(RuntimeError):
+ pass
+
+
+class GVoiceBackend(object):
+ """
+ This class encapsulates all of the knowledge necessary to interact with the GoogleVoice servers
+ the functions include login, setting up a callback number, and initalting a callback
+ """
+
+ def __init__(self, cookieFile = None):
+ # Important items in this function are the setup of the browser emulation and cookie file
+ self._browser = browser_emu.MozillaEmulator(1)
+ if cookieFile is None:
+ cookieFile = os.path.join(os.path.expanduser("~"), ".gv_cookies.txt")
+ self._browser.cookies.filename = cookieFile
+ if os.path.isfile(cookieFile):
+ self._browser.cookies.load()
+
+ self._token = ""
+ self._accountNum = ""
+ self._lastAuthed = 0.0
+ self._callbackNumber = ""
+ self._callbackNumbers = {}
+
+ def is_authed(self, force = False):
+ """
+ Attempts to detect a current session
+ @note Once logged in, this avoids reauthenticating more than once every two minutes unless forced.
+ @returns Whether authenticated
+ """
+ if (time.time() - self._lastAuthed) < 120 and not force:
+ return True
+
+ try:
+ page = self._browser.download(self._forwardURL)
+ self._grab_account_info(page)
+ except Exception, e:
+ _moduleLogger.exception(str(e))
+ return False
+
+ self._browser.cookies.save()
+ self._lastAuthed = time.time()
+ return True
+
+ _tokenURL = "http://www.google.com/voice/m"
+ _loginURL = "https://www.google.com/accounts/ServiceLoginAuth"
+ _galxRe = re.compile(r"""<input.*?name="GALX".*?value="(.*?)".*?/>""", re.MULTILINE | re.DOTALL)
+
+ def login(self, username, password):
+ """
+ Attempt to login to GoogleVoice
+ @returns Whether login was successful or not
+ """
+ try:
+ tokenPage = self._browser.download(self._tokenURL)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._tokenURL)
+ galxTokens = self._galxRe.search(tokenPage)
+ if galxTokens is not None:
+ galxToken = galxTokens.group(1)
+ else:
+ galxToken = ""
+ _moduleLogger.debug("Could not grab GALX token")
+
+ loginPostData = urllib.urlencode({
+ 'Email' : username,
+ 'Passwd' : password,
+ 'service': "grandcentral",
+ "ltmpl": "mobile",
+ "btmpl": "mobile",
+ "PersistentCookie": "yes",
+ "GALX": galxToken,
+ "continue": self._forwardURL,
+ })
+
+ try:
+ loginSuccessOrFailurePage = self._browser.download(self._loginURL, loginPostData)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._loginURL)
+
+ try:
+ self._grab_account_info(loginSuccessOrFailurePage)
+ except Exception, e:
+ _moduleLogger.exception(str(e))
+ return False
+
+ self._browser.cookies.save()
+ self._lastAuthed = time.time()
+ return True
+
+ def logout(self):
+ self._lastAuthed = 0.0
+ self._browser.cookies.clear()
+ self._browser.cookies.save()
+
+ _gvDialingStrRe = re.compile("This may take a few seconds", re.M)
+ _clicktocallURL = "https://www.google.com/voice/m/sendcall"
+
+ def dial(self, number):
+ """
+ This is the main function responsible for initiating the callback
+ """
+ number = self._send_validation(number)
+ try:
+ clickToCallData = urllib.urlencode({
+ "number": number,
+ "phone": self._callbackNumber,
+ "_rnr_se": self._token,
+ })
+ otherData = {
+ 'Referer' : 'https://google.com/voice/m/callsms',
+ }
+ callSuccessPage = self._browser.download(self._clicktocallURL, clickToCallData, None, otherData)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._clicktocallURL)
+
+ if self._gvDialingStrRe.search(callSuccessPage) is None:
+ raise RuntimeError("Google Voice returned an error")
+
+ return True
+
+ _sendSmsURL = "https://www.google.com/voice/m/sendsms"
+
+ def send_sms(self, number, message):
+ number = self._send_validation(number)
+ try:
+ smsData = urllib.urlencode({
+ "number": number,
+ "smstext": message,
+ "_rnr_se": self._token,
+ "id": "undefined",
+ "c": "undefined",
+ })
+ otherData = {
+ 'Referer' : 'https://google.com/voice/m/sms',
+ }
+ smsSuccessPage = self._browser.download(self._sendSmsURL, smsData, None, otherData)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._sendSmsURL)
+
+ return True
+
+ _validateRe = re.compile("^[0-9]{10,}$")
+
+ def is_valid_syntax(self, number):
+ """
+ @returns Whether this number can be called (syntax validation only)
+ """
+ return self._validateRe.match(number) is not None
+
+ def get_account_number(self):
+ """
+ @returns The GoogleVoice phone number
+ """
+ return self._accountNum
+
+ def get_callback_numbers(self):
+ """
+ @returns a dictionary mapping call back numbers to descriptions
+ @note These results are cached for 30 minutes.
+ """
+ if not self.is_authed():
+ return {}
+ return self._callbackNumbers
+
+ _setforwardURL = "https://www.google.com/voice/m/setphone"
+
+ def set_callback_number(self, callbacknumber):
+ """
+ Set the number that GoogleVoice calls
+ @param callbacknumber should be a proper 10 digit number
+ """
+ self._callbackNumber = callbacknumber
+ return True
+
+ def get_callback_number(self):
+ """
+ @returns Current callback number or None
+ """
+ return self._callbackNumber
+
+ _recentCallsURL = "https://www.google.com/voice/inbox/recent/"
+ _placedCallsURL = "https://www.google.com/voice/inbox/recent/placed/"
+ _receivedCallsURL = "https://www.google.com/voice/inbox/recent/received/"
+ _missedCallsURL = "https://www.google.com/voice/inbox/recent/missed/"
+
+ def get_recent(self):
+ """
+ @returns Iterable of (personsName, phoneNumber, exact date, relative date, action)
+ """
+ for action, url in (
+ ("Received", self._receivedCallsURL),
+ ("Missed", self._missedCallsURL),
+ ("Placed", self._placedCallsURL),
+ ):
+ try:
+ flatXml = self._browser.download(url)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % url)
+
+ allRecentHtml = self._grab_html(flatXml)
+ allRecentData = self._parse_voicemail(allRecentHtml)
+ for recentCallData in allRecentData:
+ recentCallData["action"] = action
+ yield recentCallData
+
+ _contactsRe = re.compile(r"""<a href="/voice/m/contact/(\d+)">(.*?)</a>""", re.S)
+ _contactsNextRe = re.compile(r""".*<a href="/voice/m/contacts(\?p=\d+)">Next.*?</a>""", re.S)
+ _contactsURL = "https://www.google.com/voice/mobile/contacts"
+
+ def get_contacts(self):
+ """
+ @returns Iterable of (contact id, contact name)
+ """
+ contactsPagesUrls = [self._contactsURL]
+ for contactsPageUrl in contactsPagesUrls:
+ try:
+ contactsPage = self._browser.download(contactsPageUrl)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % contactsPageUrl)
+ for contact_match in self._contactsRe.finditer(contactsPage):
+ contactId = contact_match.group(1)
+ contactName = saxutils.unescape(contact_match.group(2))
+ contact = contactId, contactName
+ yield contact
+
+ next_match = self._contactsNextRe.match(contactsPage)
+ if next_match is not None:
+ newContactsPageUrl = self._contactsURL + next_match.group(1)
+ contactsPagesUrls.append(newContactsPageUrl)
+
+ _contactDetailPhoneRe = re.compile(r"""<div.*?>([0-9+\-\(\) \t]+?)<span.*?>\((\w+)\)</span>""", re.S)
+ _contactDetailURL = "https://www.google.com/voice/mobile/contact"
+
+ def get_contact_details(self, contactId):
+ """
+ @returns Iterable of (Phone Type, Phone Number)
+ """
+ try:
+ detailPage = self._browser.download(self._contactDetailURL + '/' + contactId)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._contactDetailURL)
+
+ for detail_match in self._contactDetailPhoneRe.finditer(detailPage):
+ phoneNumber = detail_match.group(1)
+ phoneType = saxutils.unescape(detail_match.group(2))
+ yield (phoneType, phoneNumber)
+
+ _voicemailURL = "https://www.google.com/voice/inbox/recent/voicemail/"
+ _smsURL = "https://www.google.com/voice/inbox/recent/sms/"
+
+ def get_messages(self):
+ try:
+ voicemailPage = self._browser.download(self._voicemailURL)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._voicemailURL)
+ voicemailHtml = self._grab_html(voicemailPage)
+ parsedVoicemail = self._parse_voicemail(voicemailHtml)
+ decoratedVoicemails = self._decorate_voicemail(parsedVoicemail)
+
+ try:
+ smsPage = self._browser.download(self._smsURL)
+ except urllib2.URLError, e:
+ _moduleLogger.exception("Translating error: %s" % str(e))
+ raise NetworkError("%s is not accessible" % self._smsURL)
+ smsHtml = self._grab_html(smsPage)
+ parsedSms = self._parse_sms(smsHtml)
+ decoratedSms = self._decorate_sms(parsedSms)
+
+ allMessages = itertools.chain(decoratedVoicemails, decoratedSms)
+ return allMessages
+
+ def _grab_json(self, flatXml):
+ xmlTree = ElementTree.fromstring(flatXml)
+ jsonElement = xmlTree.getchildren()[0]
+ flatJson = jsonElement.text
+ jsonTree = parse_json(flatJson)
+ return jsonTree
+
+ def _grab_html(self, flatXml):
+ xmlTree = ElementTree.fromstring(flatXml)
+ htmlElement = xmlTree.getchildren()[1]
+ flatHtml = htmlElement.text
+ return flatHtml
+
+ _tokenRe = re.compile(r"""<input.*?name="_rnr_se".*?value="(.*?)"\s*/>""")
+ _accountNumRe = re.compile(r"""<b class="ms\d">(.{14})</b></div>""")
+ _callbackRe = re.compile(r"""\s+(.*?):\s*(.*?)<br\s*/>\s*$""", re.M)
+ _forwardURL = "https://www.google.com/voice/mobile/phones"
+
+ def _grab_account_info(self, page):
+ tokenGroup = self._tokenRe.search(page)
+ if tokenGroup is None:
+ raise RuntimeError("Could not extract authentication token from GoogleVoice")
+ self._token = tokenGroup.group(1)
+
+ anGroup = self._accountNumRe.search(page)
+ if anGroup is not None:
+ self._accountNum = anGroup.group(1)
+ else:
+ _moduleLogger.debug("Could not extract account number from GoogleVoice")
+
+ self._callbackNumbers = {}
+ for match in self._callbackRe.finditer(page):
+ callbackNumber = match.group(2)
+ callbackName = match.group(1)
+ self._callbackNumbers[callbackNumber] = callbackName
+ if len(self._callbackNumbers) == 0:
+ _moduleLogger.debug("Could not extract callback numbers from GoogleVoice (the troublesome page follows):\n%s" % page)
+
+ def _send_validation(self, number):
+ if not self.is_valid_syntax(number):
+ raise ValueError('Number is not valid: "%s"' % number)
+ elif not self.is_authed():
+ raise RuntimeError("Not Authenticated")
+
+ if len(number) == 11 and number[0] == "1":
+ # Strip leading 1 from 11 digit dialing
+ number = number[1:]
+ return number
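The 11-digit normalization `_send_validation` performs can be exercised on its own; a minimal standalone sketch (the helper name is hypothetical; note the comparison is against the string `"1"`, since numbers are handled as strings):

```python
def normalize_number(number):
    # Phone numbers are strings here, so compare against the string "1";
    # comparing against the integer 1 would never match.
    if len(number) == 11 and number[0] == "1":
        number = number[1:]
    return number

localForm = normalize_number("15551234567")
```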
+
+ _seperateVoicemailsRegex = re.compile(r"""^\s*<div id="(\w+)"\s* class=".*?gc-message.*?">""", re.MULTILINE | re.DOTALL)
+ _exactVoicemailTimeRegex = re.compile(r"""<span class="gc-message-time">(.*?)</span>""", re.MULTILINE)
+ _relativeVoicemailTimeRegex = re.compile(r"""<span class="gc-message-relative">(.*?)</span>""", re.MULTILINE)
+ _voicemailNameRegex = re.compile(r"""<a class=.*?gc-message-name-link.*?>(.*?)</a>""", re.MULTILINE | re.DOTALL)
+ _voicemailNumberRegex = re.compile(r"""<input type="hidden" class="gc-text gc-quickcall-ac" value="(.*?)"/>""", re.MULTILINE)
+ _prettyVoicemailNumberRegex = re.compile(r"""<span class="gc-message-type">(.*?)</span>""", re.MULTILINE)
+ _voicemailLocationRegex = re.compile(r"""<span class="gc-message-location">.*?<a.*?>(.*?)</a></span>""", re.MULTILINE)
+ _messagesContactID = re.compile(r"""<a class=".*?gc-message-name-link.*?">.*?</a>\s*?<span .*?>(.*?)</span>""", re.MULTILINE)
+ #_voicemailMessageRegex = re.compile(r"""<span id="\d+-\d+" class="gc-word-(.*?)">(.*?)</span>""", re.MULTILINE)
+ #_voicemailMessageRegex = re.compile(r"""<a .*? class="gc-message-mni">(.*?)</a>""", re.MULTILINE)
+ _voicemailMessageRegex = re.compile(r"""(<span id="\d+-\d+" class="gc-word-(.*?)">(.*?)</span>|<a .*? class="gc-message-mni">(.*?)</a>)""", re.MULTILINE)
+
+ @staticmethod
+ def _interpret_voicemail_regex(group):
+ quality, content, number = group.group(2), group.group(3), group.group(4)
+ if quality is not None and content is not None:
+ return quality, content
+ elif number is not None:
+ return "high", number
+
+ def _parse_voicemail(self, voicemailHtml):
+ splitVoicemail = self._seperateVoicemailsRegex.split(voicemailHtml)
+ for messageId, messageHtml in itergroup(splitVoicemail[1:], 2):
+ exactTimeGroup = self._exactVoicemailTimeRegex.search(messageHtml)
+ exactTime = exactTimeGroup.group(1).strip() if exactTimeGroup else ""
+ if exactTime:
+ exactTime = datetime.datetime.strptime(exactTime, "%m/%d/%y %I:%M %p")
+ relativeTimeGroup = self._relativeVoicemailTimeRegex.search(messageHtml)
+ relativeTime = relativeTimeGroup.group(1).strip() if relativeTimeGroup else ""
+ locationGroup = self._voicemailLocationRegex.search(messageHtml)
+ location = locationGroup.group(1).strip() if locationGroup else ""
+
+ nameGroup = self._voicemailNameRegex.search(messageHtml)
+ name = nameGroup.group(1).strip() if nameGroup else ""
+ numberGroup = self._voicemailNumberRegex.search(messageHtml)
+ number = numberGroup.group(1).strip() if numberGroup else ""
+ prettyNumberGroup = self._prettyVoicemailNumberRegex.search(messageHtml)
+ prettyNumber = prettyNumberGroup.group(1).strip() if prettyNumberGroup else ""
+ contactIdGroup = self._messagesContactID.search(messageHtml)
+ contactId = contactIdGroup.group(1).strip() if contactIdGroup else ""
+
+ messageGroups = self._voicemailMessageRegex.finditer(messageHtml)
+ # finditer always returns an iterator, so no need to guard against a falsy value
+ messageParts = (
+ self._interpret_voicemail_regex(group)
+ for group in messageGroups
+ )
+
+ yield {
+ "id": messageId.strip(),
+ "contactId": contactId,
+ "name": name,
+ "time": exactTime,
+ "relTime": relativeTime,
+ "prettyNumber": prettyNumber,
+ "number": number,
+ "location": location,
+ "messageParts": messageParts,
+ }
+
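`_parse_voicemail` relies on `re.split` with a capturing group: the split list interleaves the captured message ids with the HTML that follows each one, and `itergroup` pairs them back up. A minimal standalone sketch of that technique (the sample HTML is made up):

```python
import itertools
import re


def itergroup(iterator, count, padValue=None):
    # Same pairing trick as util.misc.itergroup: share one padded
    # iterator across `count` zip positions.
    padded = itertools.chain(iterator, itertools.repeat(padValue, count - 1))
    return zip(*((padded, ) * count))


messageRe = re.compile(r'<div id="(\w+)" class="gc-message">')
html = (
    '<div id="m1" class="gc-message">first</div>'
    '<div id="m2" class="gc-message">second</div>'
)
parts = messageRe.split(html)
# parts[0] is the text before the first match; the rest alternates id, body
pairs = list(itergroup(parts[1:], 2))
```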
+ def _decorate_voicemail(self, parsedVoicemails):
+ messagePartFormat = {
+ "med1": "<i>%s</i>",
+ "med2": "%s",
+ "high": "<b>%s</b>",
+ }
+ for voicemailData in parsedVoicemails:
+ message = " ".join((
+ messagePartFormat[quality] % part
+ for (quality, part) in voicemailData["messageParts"]
+ )).strip()
+ if not message:
+ message = "No Transcription"
+ whoFrom = voicemailData["name"]
+ when = voicemailData["time"]
+ voicemailData["messageParts"] = ((whoFrom, message, when), )
+ yield voicemailData
+
+ _smsFromRegex = re.compile(r"""<span class="gc-message-sms-from">(.*?)</span>""", re.MULTILINE | re.DOTALL)
+ _smsTimeRegex = re.compile(r"""<span class="gc-message-sms-time">(.*?)</span>""", re.MULTILINE | re.DOTALL)
+ _smsTextRegex = re.compile(r"""<span class="gc-message-sms-text">(.*?)</span>""", re.MULTILINE | re.DOTALL)
+
+ def _parse_sms(self, smsHtml):
+ splitSms = self._seperateVoicemailsRegex.split(smsHtml)
+ for messageId, messageHtml in itergroup(splitSms[1:], 2):
+ exactTimeGroup = self._exactVoicemailTimeRegex.search(messageHtml)
+ exactTime = exactTimeGroup.group(1).strip() if exactTimeGroup else ""
+ if exactTime:
+ exactTime = datetime.datetime.strptime(exactTime, "%m/%d/%y %I:%M %p")
+ relativeTimeGroup = self._relativeVoicemailTimeRegex.search(messageHtml)
+ relativeTime = relativeTimeGroup.group(1).strip() if relativeTimeGroup else ""
+
+ nameGroup = self._voicemailNameRegex.search(messageHtml)
+ name = nameGroup.group(1).strip() if nameGroup else ""
+ numberGroup = self._voicemailNumberRegex.search(messageHtml)
+ number = numberGroup.group(1).strip() if numberGroup else ""
+ prettyNumberGroup = self._prettyVoicemailNumberRegex.search(messageHtml)
+ prettyNumber = prettyNumberGroup.group(1).strip() if prettyNumberGroup else ""
+ contactIdGroup = self._messagesContactID.search(messageHtml)
+ contactId = contactIdGroup.group(1).strip() if contactIdGroup else ""
+
+ fromGroups = self._smsFromRegex.finditer(messageHtml)
+ fromParts = (group.group(1).strip() for group in fromGroups)
+ textGroups = self._smsTextRegex.finditer(messageHtml)
+ textParts = (group.group(1).strip() for group in textGroups)
+ timeGroups = self._smsTimeRegex.finditer(messageHtml)
+ timeParts = (group.group(1).strip() for group in timeGroups)
+
+ messageParts = itertools.izip(fromParts, textParts, timeParts)
+
+ yield {
+ "id": messageId.strip(),
+ "contactId": contactId,
+ "name": name,
+ "time": exactTime,
+ "relTime": relativeTime,
+ "prettyNumber": prettyNumber,
+ "number": number,
+ "location": "",
+ "messageParts": messageParts,
+ }
+
+ def _decorate_sms(self, parsedTexts):
+ return parsedTexts
+
+
+def set_sane_callback(backend):
+ """
+ Try to set a sane default callback number on these preferences
+ 1) 1747 numbers (Gizmo)
+ 2) anything with gizmo in the name
+ 3) anything with computer in the name
+ 4) anything with sip in the name
+ 5) the first value
+ """
+ numbers = backend.get_callback_numbers()
+
+ priorityOrderedCriteria = [
+ ("1747", None),
+ (None, "gizmo"),
+ (None, "computer"),
+ (None, "sip"),
+ (None, None),
+ ]
+
+ for numberCriteria, descriptionCriteria in priorityOrderedCriteria:
+ for number, description in numbers.iteritems():
+ if numberCriteria is not None and re.compile(numberCriteria).match(number) is None:
+ continue
+ if descriptionCriteria is not None and re.compile(descriptionCriteria).match(description) is None:
+ continue
+ backend.set_callback_number(number)
+ return
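The priority-ordered matching in `set_sane_callback` can be sketched independently of the backend (the function name and sample data here are illustrative only; like the original, it uses `re.match`, which anchors at the start of the string):

```python
import re


def pick_number(numbers, priorityOrderedCriteria):
    # numbers maps callback number -> description; the first number
    # satisfying the highest-priority criteria pair wins.
    for numberCriteria, descriptionCriteria in priorityOrderedCriteria:
        for number, description in sorted(numbers.items()):
            if numberCriteria is not None and re.match(numberCriteria, number) is None:
                continue
            if descriptionCriteria is not None and re.match(descriptionCriteria, description) is None:
                continue
            return number
    return None


numbers = {"17475550123": "gizmo", "15551112222": "home phone"}
criteria = [("1747", None), (None, "gizmo"), (None, None)]
chosen = pick_number(numbers, criteria)
```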
+
+
+def sort_messages(allMessages):
+ sortableAllMessages = [
+ (message["time"], message)
+ for message in allMessages
+ ]
+ sortableAllMessages.sort(reverse=True)
+ return (
+ message
+ for (exactTime, message) in sortableAllMessages
+ )
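`sort_messages` is a decorate-sort-undecorate pass: tag each message dict with its timestamp, sort descending, then strip the tag. With hypothetical sample data:

```python
def sort_newest_first(allMessages):
    # Decorate with the sortable key, sort newest-first, then undecorate.
    sortable = [(message["time"], message) for message in allMessages]
    sortable.sort(reverse=True)
    return [message for (_when, message) in sortable]


sample = [
    {"time": "2009-12-01 10:00", "name": "older"},
    {"time": "2009-12-02 09:00", "name": "newer"},
]
newestFirst = sort_newest_first(sample)
```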
+
+
+def decorate_recent(recentCallData):
+ """
+ @returns (personsName, phoneNumber, date, action)
+ """
+ contactId = recentCallData["contactId"]
+ if recentCallData["name"]:
+ header = recentCallData["name"]
+ elif recentCallData["prettyNumber"]:
+ header = recentCallData["prettyNumber"]
+ elif recentCallData["location"]:
+ header = recentCallData["location"]
+ else:
+ header = "Unknown"
+
+ number = recentCallData["number"]
+ relTime = recentCallData["relTime"]
+ action = recentCallData["action"]
+ return contactId, header, number, relTime, action
+
+
+def decorate_message(messageData):
+ contactId = messageData["contactId"]
+ exactTime = messageData["time"]
+ if messageData["name"]:
+ header = messageData["name"]
+ elif messageData["prettyNumber"]:
+ header = messageData["prettyNumber"]
+ else:
+ header = "Unknown"
+ number = messageData["number"]
+ relativeTime = messageData["relTime"]
+
+ messageParts = list(messageData["messageParts"])
+ if len(messageParts) == 0:
+ messages = ("No Transcription", )
+ elif len(messageParts) == 1:
+ messages = (messageParts[0][1], )
+ else:
+ messages = [
+ "<b>%s</b>: %s" % (messagePart[0], messagePart[1])
+ for messagePart in messageParts
+ ]
+
+ decoratedResults = contactId, header, number, relativeTime, messages
+ return decoratedResults
+
+
+def test_backend(username, password):
+ backend = GVoiceBackend()
+ print "Authenticated: ", backend.is_authed()
+ if not backend.is_authed():
+ print "Login?: ", backend.login(username, password)
+ print "Authenticated: ", backend.is_authed()
+
+ #print "Token: ", backend._token
+ #print "Account: ", backend.get_account_number()
+ #print "Callback: ", backend.get_callback_number()
+ #print "All Callback: ",
+ #import pprint
+ #pprint.pprint(backend.get_callback_numbers())
+
+ #print "Recent: "
+ #for data in backend.get_recent():
+ # pprint.pprint(data)
+ #for data in sort_messages(backend.get_recent()):
+ # pprint.pprint(decorate_recent(data))
+ #pprint.pprint(list(backend.get_recent()))
+
+ #print "Contacts: ",
+ #for contact in backend.get_contacts():
+ # print contact
+ # pprint.pprint(list(backend.get_contact_details(contact[0])))
+
+ #print "Messages: ",
+ #for message in backend.get_messages():
+ # pprint.pprint(message)
+ #for message in sort_messages(backend.get_messages()):
+ # pprint.pprint(decorate_message(message))
+
+ return backend
+
+
+if __name__ == "__main__":
+ import sys
+ logging.basicConfig(level=logging.DEBUG)
+ test_backend(sys.argv[1], sys.argv[2])
--- /dev/null
+"""
+@author: Laszlo Nagy
+@copyright: (c) 2005 by Szoftver Messias Bt.
+@licence: BSD style
+
+Objects of the MozillaEmulator class can emulate a browser that is capable of:
+
+ - cookie management
+ - configurable user agent string
+ - GET and POST
+ - multipart POST (send files)
+ - receive content into file
+
+I have seen many requests on the python mailing list about how to emulate a browser. I have been using this class for years now, without any problems. This is how you can use it:
+
+ 1. Use Firefox
+ 2. Install and open the livehttpheaders plugin
+ 3. Use the website manually with Firefox
+ 4. Check the GET and POST requests in the livehttpheaders capture window
+ 5. Create an instance of the above class and send the same GET and POST requests to the server.
+
+Optional steps:
+
+ - You can change the user agent string in the build_opener method
+ - The "encode_multipart_formdata" function can be used alone to create POST data from a list of field values and files
+"""
+
+import urllib2
+import cookielib
+import logging
+
+import socket
+
+
+_moduleLogger = logging.getLogger("gvoice.browser_emu")
+socket.setdefaulttimeout(10)
+
+
+class MozillaEmulator(object):
+
+ def __init__(self, trycount = 1):
+ """Create a new MozillaEmulator object.
+
+ @param trycount: The download() method will retry the operation if it fails. You can specify -1 for infinite retrying.
+ A value of 0 means no retrying; a value of 1 means one retry, and so on."""
+ self.cookies = cookielib.LWPCookieJar()
+ self.debug = False
+ self.trycount = trycount
+
+ def build_opener(self, url, postdata = None, extraheaders = None, forbid_redirect = False):
+ if extraheaders is None:
+ extraheaders = {}
+
+ txheaders = {
+ 'Accept': 'text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png',
+ 'Accept-Language': 'en,en-us;q=0.5',
+ 'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.7',
+ }
+ for key, value in extraheaders.iteritems():
+ txheaders[key] = value
+ req = urllib2.Request(url, postdata, txheaders)
+ self.cookies.add_cookie_header(req)
+ if forbid_redirect:
+ redirector = HTTPNoRedirector()
+ else:
+ redirector = urllib2.HTTPRedirectHandler()
+
+ http_handler = urllib2.HTTPHandler(debuglevel=self.debug)
+ https_handler = urllib2.HTTPSHandler(debuglevel=self.debug)
+
+ u = urllib2.build_opener(
+ http_handler,
+ https_handler,
+ urllib2.HTTPCookieProcessor(self.cookies),
+ redirector
+ )
+ u.addheaders = [(
+ 'User-Agent',
+ 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.8) Gecko/20050511 Firefox/1.0.4'
+ )]
+ if postdata is not None:
+ req.add_data(postdata)
+ return (req, u)
+
+ def download(self, url,
+ postdata = None, extraheaders = None, forbid_redirect = False,
+ trycount = None, only_head = False,
+ ):
+ """Download an URL with GET or POST methods.
+
+ @param postdata: It can be a string that will be POST-ed to the URL.
+ When None is given, the method will be GET instead.
+ @param extraheaders: You can add/modify HTTP headers with a dict here.
+ @param forbid_redirect: Set this flag if you do not want to handle
+ HTTP 301 and 302 redirects.
+ @param trycount: Specify the maximum number of retries here.
+ 0 means no retry on error. Using -1 means infinite retrying.
+ None means the default value (that is self.trycount).
+ @param only_head: Create the openerdirector and return it. In other
+ words, this will not retrieve any content except HTTP headers.
+
+ @return: The raw HTML page data
+ """
+ _moduleLogger.info("Performing download of %s" % url)
+
+ if extraheaders is None:
+ extraheaders = {}
+ if trycount is None:
+ trycount = self.trycount
+ cnt = 0
+
+ while True:
+ try:
+ req, u = self.build_opener(url, postdata, extraheaders, forbid_redirect)
+ openerdirector = u.open(req)
+ if self.debug:
+ _moduleLogger.info("%r - %r" % (req.get_method(), url))
+ _moduleLogger.info("%r - %r" % (openerdirector.code, openerdirector.msg))
+ _moduleLogger.info("%r" % (openerdirector.headers))
+ self.cookies.extract_cookies(openerdirector, req)
+ if only_head:
+ return openerdirector
+
+ return self._read(openerdirector, trycount)
+ except urllib2.URLError:
+ cnt += 1
+ if (-1 < trycount) and (trycount < cnt):
+ raise
+
+ # Retry :-)
+ _moduleLogger.info("MozillaEmulator: urllib2.URLError, retrying %d" % cnt)
+
+ def _read(self, openerdirector, trycount):
+ chunks = []
+
+ chunk = openerdirector.read()
+ chunks.append(chunk)
+ #while chunk and cnt < trycount:
+ # time.sleep(1)
+ # cnt += 1
+ # chunk = openerdirector.read()
+ # chunks.append(chunk)
+
+ data = "".join(chunks)
+
+ if "Content-Length" in openerdirector.info():
+ assert len(data) == int(openerdirector.info()["Content-Length"]), "The packet header promised %s bytes but only %s bytes were read" % (
+ openerdirector.info()["Content-Length"],
+ len(data),
+ )
+
+ return data
+
+
+class HTTPNoRedirector(urllib2.HTTPRedirectHandler):
+ """This is a custom http redirect handler that FORBIDS redirection."""
+
+ def http_error_302(self, req, fp, code, msg, headers):
+ e = urllib2.HTTPError(req.get_full_url(), code, msg, headers, fp)
+ if e.code in (301, 302):
+ if 'location' in headers:
+ e.newurl = headers.getheaders('location')[0]
+ elif 'uri' in headers:
+ e.newurl = headers.getheaders('uri')[0]
+ raise e
--- /dev/null
+#!/usr/bin/python
+
+
+import logging
+
+import util.coroutines as coroutines
+
+import backend
+
+
+_moduleLogger = logging.getLogger("gvoice.conversations")
+
+
+class Conversations(object):
+
+ def __init__(self, backend):
+ self._backend = backend
+ self._conversations = {}
+
+ self.updateSignalHandler = coroutines.CoTee()
+ self.update()
+
+ def update(self, force=False):
+ if not force and self._conversations:
+ return
+
+ oldConversationIds = set(self._conversations.iterkeys())
+
+ updateConversationIds = set()
+ messages = self._backend.get_messages()
+ sortedMessages = backend.sort_messages(messages)
+ for messageData in sortedMessages:
+ key = messageData["contactId"], messageData["number"]
+ try:
+ conversation = self._conversations[key]
+ isNewConversation = False
+ except KeyError:
+ conversation = Conversation(self._backend, messageData)
+ self._conversations[key] = conversation
+ isNewConversation = True
+
+ if isNewConversation:
+ # @todo see if this has issues with a user marking an item as unread or unarchiving it?
+ isConversationUpdated = True
+ else:
+ isConversationUpdated = conversation.merge_conversation(messageData)
+
+ if isConversationUpdated:
+ updateConversationIds.add(key)
+
+ if updateConversationIds:
+ message = (self, updateConversationIds, )
+ self.updateSignalHandler.stage.send(message)
+
+ def get_conversations(self):
+ return self._conversations.iterkeys()
+
+ def get_conversation(self, key):
+ return self._conversations[key]
+
+
+class Conversation(object):
+
+ def __init__(self, backend, data):
+ self._backend = backend
+ self._data = dict((key, value) for (key, value) in data.iteritems())
+
+ # confirm we have a list
+ self._data["messageParts"] = list(
+ self._append_time(message, self._data["time"])
+ for message in self._data["messageParts"]
+ )
+
+ def __getitem__(self, key):
+ return self._data[key]
+
+ def merge_conversation(self, moreData):
+ """
+ @returns True if there was content to merge (new messages arrived
+ rather than being a duplicate)
+
+ @warning This assumes merges are done in chronological order
+ """
+ for constantField in ("contactId", "number"):
+ assert self._data[constantField] == moreData[constantField], "Constant field changed, something is seriously messed up: %r v %r" % (self._data, moreData)
+
+ if moreData["time"] < self._data["time"]:
+ # If it's older, assume it has nothing new to report
+ return False
+
+ for preferredMoreField in ("id", "name", "time", "relTime", "prettyNumber", "location"):
+ preferredFieldValue = moreData[preferredMoreField]
+ if preferredFieldValue:
+ self._data[preferredMoreField] = preferredFieldValue
+
+ messageAppended = False
+
+ messageParts = self._data["messageParts"]
+ for message in moreData["messageParts"]:
+ messageWithTimestamp = self._append_time(message, moreData["time"])
+ if messageWithTimestamp not in messageParts:
+ messageParts.append(messageWithTimestamp)
+ messageAppended = True
+ messageParts.sort()
+
+ return messageAppended
+
+ @staticmethod
+ def _append_time(message, exactWhen):
+ whoFrom, body, when = message
+ return exactWhen, whoFrom, body, when
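The dedup-and-append step inside `merge_conversation` can be shown on its own; this sketch (names are illustrative) timestamps incoming message parts and reports a merge only when something new was actually appended:

```python
def merge_parts(existingParts, incomingParts, exactWhen):
    # Timestamp each incoming part, skip exact duplicates, and keep the
    # accumulated list chronologically sorted.
    appended = False
    for part in incomingParts:
        stamped = (exactWhen, ) + tuple(part)
        if stamped not in existingParts:
            existingParts.append(stamped)
            appended = True
    existingParts.sort()
    return appended


parts = []
first = merge_parts(parts, [("Alice", "hi there", "10:00am")], 1)
second = merge_parts(parts, [("Alice", "hi there", "10:00am")], 1)
```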
--- /dev/null
+#!/usr/bin/env python
+
+import logging
+
+import backend
+import addressbook
+import conversations
+
+
+_moduleLogger = logging.getLogger("gvoice.session")
+
+
+class Session(object):
+
+ def __init__(self, cookiePath):
+ self._cookiePath = cookiePath
+ self._username = None
+ self._password = None
+ self._backend = None
+ self._addressbook = None
+ self._conversations = None
+
+ def login(self, username, password):
+ self._username = username
+ self._password = password
+ self._backend = backend.GVoiceBackend(self._cookiePath)
+ if not self._backend.is_authed():
+ self._backend.login(self._username, self._password)
+
+ def logout(self):
+ self._username = None
+ self._password = None
+ self._backend = None
+ self._addressbook = None
+ self._conversations = None
+
+ def is_logged_in(self):
+ if self._backend is None:
+ return False
+ elif self._backend.is_authed():
+ return True
+ else:
+ try:
+ loggedIn = self._backend.login(self._username, self._password)
+ except RuntimeError:
+ loggedIn = False
+ if loggedIn:
+ return True
+ else:
+ self.logout()
+ return False
+
+ @property
+ def backend(self):
+ """
+ Login enforcing backend
+ """
+ assert self.is_logged_in(), "User not logged in"
+ return self._backend
+
+ @property
+ def addressbook(self):
+ """
+ Delay initialized addressbook
+ """
+ if self._addressbook is None:
+ _moduleLogger.info("Initializing addressbook")
+ self._addressbook = addressbook.Addressbook(self.backend)
+ return self._addressbook
+
+ @property
+ def conversations(self):
+ """
+ Delay initialized conversations
+ """
+ if self._conversations is None:
+ _moduleLogger.info("Initializing conversations")
+ self._conversations = conversations.Conversations(self.backend)
+ return self._conversations
--- /dev/null
+import logging
+import weakref
+
+import telepathy
+
+
+_moduleLogger = logging.getLogger("handle")
+
+
+class TheOneRingHandle(telepathy.server.Handle):
+ """
+ Instances are memoized
+ """
+
+ def __init__(self, connection, id, handleType, name):
+ telepathy.server.Handle.__init__(self, id, handleType, name)
+ self._conn = weakref.proxy(connection)
+
+ def __repr__(self):
+ return "<%s id=%u name='%s'>" % (
+ type(self).__name__, self.id, self.name
+ )
+
+ id = property(telepathy.server.Handle.get_id)
+ type = property(telepathy.server.Handle.get_type)
+ name = property(telepathy.server.Handle.get_name)
+
+
+class ConnectionHandle(TheOneRingHandle):
+
+ def __init__(self, connection, id):
+ handleType = telepathy.HANDLE_TYPE_CONTACT
+ handleName = connection.username
+ TheOneRingHandle.__init__(self, connection, id, handleType, handleName)
+
+ self.profile = connection.username
+
+
+def strip_number(prettynumber):
+ """
+ function to take a phone number and strip out all non-numeric
+ characters
+
+ >>> strip_number("+012-(345)-678-90")
+ '01234567890'
+ """
+ import re
+ uglynumber = re.sub(r'\D', '', prettynumber)
+ return uglynumber
+
+
+class ContactHandle(TheOneRingHandle):
+
+ def __init__(self, connection, id, contactId, phoneNumber):
+ handleType = telepathy.HANDLE_TYPE_CONTACT
+ handleName = self.to_handle_name(contactId, phoneNumber)
+ TheOneRingHandle.__init__(self, connection, id, handleType, handleName)
+
+ self._contactId = contactId
+ self._phoneNumber = phoneNumber
+
+ @staticmethod
+ def from_handle_name(handleName):
+ parts = handleName.split("#")
+ assert len(parts) == 2
+ contactId, contactNumber = parts[0:2]
+ return contactId, contactNumber
+
+ @staticmethod
+ def to_handle_name(contactId, contactNumber):
+ handleName = "#".join((contactId, strip_number(contactNumber)))
+ return handleName
+
+ @property
+ def contactID(self):
+ return self._contactId
+
+ @property
+ def contactDetails(self):
+ return self._conn.addressbook.get_contact_details(self._contactId)
+
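ContactHandle packs both identifiers into a single handle name; the round trip through `to_handle_name`/`from_handle_name` can be sketched standalone:

```python
import re


def to_handle_name(contactId, contactNumber):
    # "#" joins the contact id with the digits-only form of the number
    return "#".join((contactId, re.sub(r"\D", "", contactNumber)))


def from_handle_name(handleName):
    contactId, contactNumber = handleName.split("#")
    return contactId, contactNumber


name = to_handle_name("12345", "+1 (555) 678-9012")
```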
+
+class ListHandle(TheOneRingHandle):
+
+ def __init__(self, connection, id, listName):
+ handleType = telepathy.HANDLE_TYPE_LIST
+ handleName = listName
+ TheOneRingHandle.__init__(self, connection, id, handleType, handleName)
+
+
+_HANDLE_TYPE_MAPPING = {
+ 'connection': ConnectionHandle,
+ 'contact': ContactHandle,
+ 'list': ListHandle,
+}
+
+
+def create_handle_factory():
+
+ cache = weakref.WeakValueDictionary()
+
+ def create_handle(connection, type, *args):
+ Handle = _HANDLE_TYPE_MAPPING[type]
+ key = Handle, connection.username, args
+ try:
+ handle = cache[key]
+ isNewHandle = False
+ except KeyError:
+ # The misnamed get_handle_id requests a new handle id
+ handle = Handle(connection, connection.get_handle_id(), *args)
+ cache[key] = handle
+ isNewHandle = True
+ connection._handles[handle.get_type(), handle.get_id()] = handle
+ handleStatus = "Is New!" if isNewHandle else "From Cache"
+ _moduleLogger.info("Created Handle: %r (%s)" % (handle, handleStatus))
+ return handle
+
+ return create_handle
+
+
+create_handle = create_handle_factory()
--- /dev/null
+import telepathy
+
+
+#class LocationMixin(telepathy.server.ConnectionInterfaceLocation):
+class LocationMixin(object):
+
+ def __init__(self):
+ #telepathy.server.ConnectionInterfaceLocation.__init__(self)
+ pass
+
+ @property
+ def session(self):
+ """
+ @abstract
+ """
+ raise NotImplementedError()
+
+ def GetLocations(self, contacts):
+ """
+ @returns {Contact: {Location Type: Location}}
+ """
+ raise NotImplementedError()
+
+ def RequestLocation(self, contact):
+ """
+ @returns {Location Type: Location}
+ """
+ raise NotImplementedError()
+
+ def SetLocation(self, location):
+ """
+ Since presence is based off of phone numbers, not allowing the client to change it
+ """
+ raise telepathy.errors.PermissionDenied()
--- /dev/null
+import logging
+
+import telepathy
+
+
+_moduleLogger = logging.getLogger("simple_presence")
+
+
+class TheOneRingPresence(object):
+ ONLINE = 'available'
+ BUSY = 'dnd'
+
+ TO_PRESENCE_TYPE = {
+ ONLINE: telepathy.constants.CONNECTION_PRESENCE_TYPE_AVAILABLE,
+ BUSY: telepathy.constants.CONNECTION_PRESENCE_TYPE_BUSY,
+ }
+
+
+class SimplePresenceMixin(telepathy.server.ConnectionInterfaceSimplePresence):
+
+ def __init__(self):
+ telepathy.server.ConnectionInterfaceSimplePresence.__init__(self)
+
+ dbus_interface = 'org.freedesktop.Telepathy.Connection.Interface.SimplePresence'
+
+ self._implement_property_get(dbus_interface, {'Statuses' : self._get_statuses})
+
+ @property
+ def session(self):
+ """
+ @abstract
+ """
+ raise NotImplementedError()
+
+ def GetPresences(self, contacts):
+ """
+ @todo Figure out how to know when its self and get whether busy or not
+
+ @return {ContactHandle: (Status, Presence Type, Message)}
+ """
+ presences = {}
+ for handleId in contacts:
+ handle = self.handle(telepathy.HANDLE_TYPE_CONTACT, handleId)
+
+ presence = TheOneRingPresence.BUSY
+ personalMessage = u""
+ presenceType = TheOneRingPresence.TO_PRESENCE_TYPE[presence]
+
+ presences[handle] = (presenceType, presence, personalMessage)
+ return presences
+
+ def SetPresence(self, status, message):
+ if message:
+ raise telepathy.errors.InvalidArgument
+
+ if status == TheOneRingPresence.ONLINE:
+ self.gvoice_backend.mark_dnd(False)
+ elif status == TheOneRingPresence.BUSY:
+ self.gvoice_backend.mark_dnd(True)
+ else:
+ raise telepathy.errors.InvalidArgument
+ _moduleLogger.info("Setting Presence to '%s'" % status)
+
+
+ def _get_statuses(self):
+ """
+ Property mapping presence statuses available to the corresponding presence types
+
+ @returns {Name: (Telepathy Type, May Set On Self, Can Have Message)}
+ """
+ return {
+ TheOneRingPresence.ONLINE: (
+ telepathy.CONNECTION_PRESENCE_TYPE_AVAILABLE,
+ True, False
+ ),
+ TheOneRingPresence.BUSY: (
+ telepathy.CONNECTION_PRESENCE_TYPE_BUSY,
+ True, False
+ ),
+ }
+
--- /dev/null
+#!/usr/bin/env python
+
+"""
+Telepathy-TheOneRing - Telepathy plugin for GoogleVoice
+Copyright (C) 2009 Ed Page eopage AT byu DOT net
+
+This library is free software; you can redistribute it and/or
+modify it under the terms of the GNU Lesser General Public
+License as published by the Free Software Foundation; either
+version 2.1 of the License, or (at your option) any later version.
+
+This library is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+Lesser General Public License for more details.
+
+You should have received a copy of the GNU Lesser General Public
+License along with this library; if not, write to the Free Software
+Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+"""
+
+import os
+import sys
+import signal
+import logging
+import gobject
+
+import dbus.glib
+import telepathy.utils as telepathy_utils
+
+import util.linux as linux_utils
+import util.go_utils as gobject_utils
+import constants
+import connection_manager
+
+
+IDLE_TIMEOUT = 5000
+
+
+def run_theonering(persist):
+ linux_utils.set_process_name(constants.__app_name__)
+
+ try:
+ os.makedirs(constants._data_path_)
+ except OSError, e:
+ if e.errno != 17: # 17 == errno.EEXIST, the directory already exists
+ raise
+
+ @gobject_utils.async
+ def quit():
+ manager.quit()
+ mainloop.quit()
+
+ def timeout_cb():
+ if len(manager._connections) == 0:
+ logging.info('No connection received - quitting')
+ quit()
+ return False
+
+ if persist:
+ shutdown_callback = None
+ else:
+ gobject.timeout_add(IDLE_TIMEOUT, timeout_cb)
+ shutdown_callback = quit
+
+ signal.signal(signal.SIGTERM, lambda signum, frame: quit())
+
+ try:
+ manager = connection_manager.TheOneRingConnectionManager(shutdown_func=shutdown_callback)
+ except dbus.exceptions.NameExistsException:
+ logging.warning('Failed to acquire bus name, connection manager already running?')
+ sys.exit(1)
+
+ mainloop = gobject.MainLoop(is_running=True)
+
+ while mainloop.is_running():
+ try:
+ mainloop.run()
+ except KeyboardInterrupt:
+ quit()
+
+
+if __name__ == '__main__':
+ telepathy_utils.debug_divert_messages(os.getenv('THEONERING_LOGFILE'))
+ logging.basicConfig(level=logging.DEBUG)
+
+ persist = 'THEONERING_PERSIST' in os.environ
+ run_theonering(persist)
--- /dev/null
+#!/usr/bin/env python
--- /dev/null
+#!/usr/bin/env python
+
+"""
+@note Source http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66448
+"""
+
+import itertools
+import functools
+import datetime
+import types
+
+
+def ordered_itr(collection):
+ """
+ >>> [v for v in ordered_itr({"a": 1, "b": 2})]
+ [('a', 1), ('b', 2)]
+ >>> [v for v in ordered_itr([3, 1, 10, -20])]
+ [-20, 1, 3, 10]
+ """
+ if isinstance(collection, types.DictType):
+ keys = list(collection.iterkeys())
+ keys.sort()
+ for key in keys:
+ yield key, collection[key]
+ else:
+ values = list(collection)
+ values.sort()
+ for value in values:
+ yield value
+
+
+def itercat(*iterators):
+ """
+ Concatenate several iterators into one.
+
+ >>> [v for v in itercat([1, 2, 3], [4, 1, 3])]
+ [1, 2, 3, 4, 1, 3]
+ """
+ for i in iterators:
+ for x in i:
+ yield x
+
+
+def iterwhile(func, iterator):
+ """
+ Iterate for as long as func(value) returns true.
+ >>> through = lambda b: b
+ >>> [v for v in iterwhile(through, [True, True, False])]
+ [True, True]
+ """
+ iterator = iter(iterator)
+ while 1:
+ next = iterator.next()
+ if not func(next):
+ raise StopIteration
+ yield next
+
+
+def iterfirst(iterator, count=1):
+ """
+ Iterate through 'count' first values.
+
+ >>> [v for v in iterfirst([1, 2, 3, 4, 5], 3)]
+ [1, 2, 3]
+ """
+ iterator = iter(iterator)
+ for i in xrange(count):
+ yield iterator.next()
+
+
+def iterstep(iterator, n):
+ """
+ Iterate every nth value.
+
+ >>> [v for v in iterstep([1, 2, 3, 4, 5], 1)]
+ [1, 2, 3, 4, 5]
+ >>> [v for v in iterstep([1, 2, 3, 4, 5], 2)]
+ [1, 3, 5]
+ >>> [v for v in iterstep([1, 2, 3, 4, 5], 3)]
+ [1, 4]
+ """
+ iterator = iter(iterator)
+ while True:
+ yield iterator.next()
+ # skip n-1 values
+ for dummy in xrange(n-1):
+ iterator.next()
+
+
+def itergroup(iterator, count, padValue = None):
+ """
+ Iterate in groups of 'count' values. If there
+ aren't enough values, the last result is padded with
+ None.
+
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6], 3):
+ ... print tuple(val)
+ (1, 2, 3)
+ (4, 5, 6)
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6], 3):
+ ... print list(val)
+ [1, 2, 3]
+ [4, 5, 6]
+ >>> for val in itergroup([1, 2, 3, 4, 5, 6, 7], 3):
+ ... print tuple(val)
+ (1, 2, 3)
+ (4, 5, 6)
+ (7, None, None)
+ >>> for val in itergroup("123456", 3):
+ ... print tuple(val)
+ ('1', '2', '3')
+ ('4', '5', '6')
+ >>> for val in itergroup("123456", 3):
+ ... print repr("".join(val))
+ '123'
+ '456'
+ """
+ paddedIterator = itertools.chain(iterator, itertools.repeat(padValue, count-1))
+ nIterators = (paddedIterator, ) * count
+ return itertools.izip(*nIterators)
+
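The padding trick in `itergroup` survives unchanged in modern Python; a minimal Python 3 sketch (`izip` replaced by the builtin `zip`):

```python
import itertools

def itergroup(iterator, count, pad=None):
    # pad the stream so the final group is full, then zip `count`
    # references to the *same* iterator to slice it into groups
    padded = itertools.chain(iterator, itertools.repeat(pad, count - 1))
    return zip(*([padded] * count))

print(list(itergroup([1, 2, 3, 4, 5, 6, 7], 3)))
# [(1, 2, 3), (4, 5, 6), (7, None, None)]
```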
+
+def xzip(*iterators):
+ """Iterative version of builtin 'zip'."""
+ iterators = map(iter, iterators) # a list: it is traversed once per yielded tuple
+ while 1:
+ yield tuple([x.next() for x in iterators])
+
+
+def xmap(func, *iterators):
+ """Iterative version of builtin 'map'."""
+ iterators = map(iter, iterators) # a list is needed: exhausted slots are set to None below
+ values_left = [1]
+
+ def values():
+ # Emulate map behaviour, i.e. shorter
+ # sequences are padded with None when
+ # they run out of values.
+ values_left[0] = 0
+ for i in range(len(iterators)):
+ iterator = iterators[i]
+ if iterator is None:
+ yield None
+ else:
+ try:
+ yield iterator.next()
+ values_left[0] = 1
+ except StopIteration:
+ iterators[i] = None
+ yield None
+ while 1:
+ args = tuple(values())
+ if not values_left[0]:
+ raise StopIteration
+ yield func(*args)
+
+
+def xfilter(func, iterator):
+ """Iterative version of builtin 'filter'."""
+ iterator = iter(iterator)
+ while 1:
+ next = iterator.next()
+ if func(next):
+ yield next
+
+
+def xreduce(func, iterator, default=None):
+ """Iterative version of builtin 'reduce'."""
+ iterator = iter(iterator)
+ try:
+ prev = iterator.next()
+ except StopIteration:
+ return default
+ single = 1
+ for next in iterator:
+ single = 0
+ prev = func(prev, next)
+ if single:
+ return func(prev, default)
+ return prev
+
+
+def daterange(begin, end, delta = datetime.timedelta(1)):
+ """
+ Form a range of dates and iterate over them.
+
+ Arguments:
+ begin -- a date (or datetime) object; the beginning of the range.
+ end -- a date (or datetime) object; the end of the range.
+ delta -- (optional) a datetime.timedelta object; how much to step each iteration.
+ Default step is 1 day.
+
+ Usage:
+ >>> list(daterange(datetime.date(2000, 1, 1), datetime.date(2000, 1, 4)))
+ [datetime.date(2000, 1, 1), datetime.date(2000, 1, 2), datetime.date(2000, 1, 3)]
+ """
+ if not isinstance(delta, datetime.timedelta):
+ delta = datetime.timedelta(delta)
+
+ ZERO = datetime.timedelta(0)
+
+ if begin < end:
+ if delta <= ZERO:
+ raise StopIteration
+ test = end.__gt__
+ else:
+ if delta >= ZERO:
+ raise StopIteration
+ test = end.__lt__
+
+ while test(begin):
+ yield begin
+ begin += delta
+
+
+class LazyList(object):
+ """
+ A Sequence whose values are computed lazily by an iterator.
+
+ A LazyList can be used to represent sequences of values generated
+ lazily. One can also create recursively defined lazy lists that
+ generate their values based on ones previously generated.
+
+ Backport to python 2.5 by Michael Pust
+ """
+
+ __author__ = 'Dan Spitz'
+
+ def __init__(self, iterable):
+ self._exhausted = False
+ self._iterator = iter(iterable)
+ self._data = []
+
+ def __len__(self):
+ """Get the length of a LazyList's computed data."""
+ return len(self._data)
+
+ def __getitem__(self, i):
+ """Get an item from a LazyList.
+ i should be a positive integer or a slice object."""
+ if isinstance(i, int):
+ #index has not yet been yielded by iterator (or iterator exhausted
+ #before reaching that index)
+ if i >= len(self):
+ self.exhaust(i)
+ elif i < 0:
+ raise ValueError('cannot index LazyList with negative number')
+ return self._data[i]
+
+ #LazyList slices are iterators over a portion of the list.
+ elif isinstance(i, slice):
+ start, stop, step = i.start, i.stop, i.step
+ if any(x is not None and x < 0 for x in (start, stop, step)):
+ raise ValueError('cannot index or step through a LazyList'
+ ' with a negative number')
+ #set start and step to their integer defaults if they are None.
+ if start is None:
+ start = 0
+ if step is None:
+ step = 1
+
+ def LazyListIterator():
+ count = start
+ predicate = (
+ (lambda: True)
+ if stop is None
+ else (lambda: count < stop)
+ )
+ while predicate():
+ try:
+ yield self[count]
+ #slices can go out of actual index range without raising an
+ #error
+ except IndexError:
+ break
+ count += step
+ return LazyListIterator()
+
+ raise TypeError('i must be an integer or slice')
+
+ def __iter__(self):
+ """return an iterator over each value in the sequence,
+ whether it has been computed yet or not."""
+ return self[:]
+
+ def computed(self):
+ """Return an iterator over the values in a LazyList that have
+ already been computed."""
+ return self[:len(self)]
+
+ def exhaust(self, index = None):
+ """Exhaust the iterator generating this LazyList's values.
+ if index is None, this will exhaust the iterator completely.
+ Otherwise, it will iterate over the iterator until either the list
+ has a value for index or the iterator is exhausted.
+ """
+ if self._exhausted:
+ return
+ if index is None:
+ ind_range = itertools.count(len(self))
+ else:
+ ind_range = range(len(self), index + 1)
+
+ for ind in ind_range:
+ try:
+ self._data.append(self._iterator.next())
+ except StopIteration: #iterator is fully exhausted
+ self._exhausted = True
+ break
+
+
+class RecursiveLazyList(LazyList):
+
+ def __init__(self, prod, *args, **kwds):
+ super(RecursiveLazyList, self).__init__(prod(self, *args, **kwds))
+
+
+class RecursiveLazyListFactory(object):
+
+ def __init__(self, producer):
+ self._gen = producer
+
+ def __call__(self, *a, **kw):
+ return RecursiveLazyList(self._gen, *a, **kw)
+
+
+def lazylist(gen):
+ """
+ Decorator for creating a RecursiveLazyList subclass.
+ This should decorate a generator function taking the LazyList object as its
+ first argument which yields the contents of the list in order.
+
+ >>> #fibonnacci sequence in a lazy list.
+ >>> @lazylist
+ ... def fibgen(lst):
+ ... yield 0
+ ... yield 1
+ ... for a, b in itertools.izip(lst, lst[1:]):
+ ... yield a + b
+ ...
+ >>> #now fibs can be indexed or iterated over as if it were an infinitely long list containing the fibonnaci sequence
+ >>> fibs = fibgen()
+ >>>
+ >>> #prime numbers in a lazy list.
+ >>> @lazylist
+ ... def primegen(lst):
+ ... yield 2
+ ... for candidate in itertools.count(3): #start at next number after 2
+ ... #if candidate is not divisible by any smaller prime numbers,
+ ... #it is a prime.
+ ... if all(candidate % p for p in lst.computed()):
+ ... yield candidate
+ ...
+ >>> #same for primes- treat it like an infinitely long list containing all prime numbers.
+ >>> primes = primegen()
+ >>> print fibs[0], fibs[1], fibs[2], primes[0], primes[1], primes[2]
+ 0 1 1 2 3 5
+ >>> print list(fibs[:10]), list(primes[:10])
+ [0, 1, 1, 2, 3, 5, 8, 13, 21, 34] [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
+ """
+ return RecursiveLazyListFactory(gen)
+
+
+def map_func(f):
+ """
+ >>> import misc
+ >>> misc.validate_decorator(map_func)
+ """
+
+ @functools.wraps(f)
+ def wrapper(*args):
+ result = itertools.imap(f, args)
+ return result
+ return wrapper
+
+
+def reduce_func(function):
+ """
+ >>> import misc
+ >>> misc.validate_decorator(reduce_func(lambda x: x))
+ """
+
+ def decorator(f):
+
+ @functools.wraps(f)
+ def wrapper(*args):
+ result = reduce(function, f(args))
+ return result
+ return wrapper
+ return decorator
+
+
+def any_(iterable):
+ """
+ @note Python Version <2.5
+
+ >>> any_([True, True])
+ True
+ >>> any_([True, False])
+ True
+ >>> any_([False, False])
+ False
+ """
+
+ for element in iterable:
+ if element:
+ return True
+ return False
+
+
+def all_(iterable):
+ """
+ @note Python Version <2.5
+
+ >>> all_([True, True])
+ True
+ >>> all_([True, False])
+ False
+ >>> all_([False, False])
+ False
+ """
+
+ for element in iterable:
+ if not element:
+ return False
+ return True
+
+
+def for_every(pred, seq):
+ """
+ for_every takes a one argument predicate function and a sequence.
+ @param pred The predicate function should return true or false.
+ @returns true if every element in seq returns true for predicate, else returns false.
+
+ >>> for_every (lambda c: c > 5,(6,7,8,9))
+ True
+
+ @author Source:http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52907
+ """
+
+ for i in seq:
+ if not pred(i):
+ return False
+ return True
+
+
+def there_exists(pred, seq):
+ """
+ there_exists takes a one argument predicate function and a sequence.
+ @param pred The predicate function should return true or false.
+ @returns true if any element in seq returns true for predicate, else returns false.
+
+ >>> there_exists (lambda c: c > 5,(6,7,8,9))
+ True
+
+ @author Source:http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52907
+ """
+
+ for i in seq:
+ if pred(i):
+ return True
+ return False
+
+
+def func_repeat(quantity, func, *args, **kwd):
+ """
+ Meant to be in connection with "reduce"
+ """
+ for i in xrange(quantity):
+ yield func(*args, **kwd)
+
+
+def function_map(preds, item):
+ """
+ Meant to be in connection with "reduce"
+ """
+ results = (pred(item) for pred in preds)
+
+ return results
+
+
+def functional_if(combiner, preds, item):
+ """
+ Combines the result of a list of predicates applied to item according to combiner
+
+ @see any, every for example combiners
+ """
+ pass_bool = lambda b: b
+
+ bool_results = function_map(preds, item)
+ return combiner(pass_bool, bool_results)
+
+
+def pushback_itr(itr):
+ """
+ >>> list(pushback_itr(xrange(5)))
+ [0, 1, 2, 3, 4]
+ >>>
+ >>> first = True
+ >>> itr = pushback_itr(xrange(5))
+ >>> for i in itr:
+ ... print i
+ ... if first and i == 2:
+ ... first = False
+ ... print itr.send(i)
+ 0
+ 1
+ 2
+ None
+ 2
+ 3
+ 4
+ >>>
+ >>> first = True
+ >>> itr = pushback_itr(xrange(5))
+ >>> for i in itr:
+ ... print i
+ ... if first and i == 2:
+ ... first = False
+ ... print itr.send(i)
+ ... print itr.send(i)
+ 0
+ 1
+ 2
+ None
+ None
+ 2
+ 2
+ 3
+ 4
+ >>>
+ >>> itr = pushback_itr(xrange(5))
+ >>> print itr.next()
+ 0
+ >>> print itr.next()
+ 1
+ >>> print itr.send(10)
+ None
+ >>> print itr.next()
+ 10
+ >>> print itr.next()
+ 2
+ >>> print itr.send(20)
+ None
+ >>> print itr.send(30)
+ None
+ >>> print itr.send(40)
+ None
+ >>> print itr.next()
+ 40
+ >>> print itr.next()
+ 30
+ >>> print itr.send(50)
+ None
+ >>> print itr.next()
+ 50
+ >>> print itr.next()
+ 20
+ >>> print itr.next()
+ 3
+ >>> print itr.next()
+ 4
+ """
+ for item in itr:
+ maybePushedBack = yield item
+ queue = []
+ while queue or maybePushedBack is not None:
+ if maybePushedBack is not None:
+ queue.append(maybePushedBack)
+ maybePushedBack = yield None
+ else:
+ item = queue.pop()
+ maybePushedBack = yield item
+
+
+if __name__ == "__main__":
+ import doctest
+ print doctest.testmod()
--- /dev/null
+#!/usr/bin/env python
+
+from __future__ import with_statement
+
+import os
+import errno
+import time
+import functools
+import contextlib
+
+
+def synchronized(lock):
+ """
+ Synchronization decorator.
+
+ >>> import misc
+ >>> misc.validate_decorator(synchronized(object()))
+ """
+
+ def wrap(f):
+
+ @functools.wraps(f)
+ def newFunction(*args, **kw):
+ lock.acquire()
+ try:
+ return f(*args, **kw)
+ finally:
+ lock.release()
+ return newFunction
+ return wrap
+
+
+@contextlib.contextmanager
+def qlock(queue, gblock = True, gtimeout = None, pblock = True, ptimeout = None):
+ """
+ Locking with a queue, good for when you want to lock an item passed around
+
+ >>> import Queue
+ >>> item = 5
+ >>> lock = Queue.Queue()
+ >>> lock.put(item)
+ >>> with qlock(lock) as i:
+ ... print i
+ 5
+ """
+ item = queue.get(gblock, gtimeout)
+ try:
+ yield item
+ finally:
+ queue.put(item, pblock, ptimeout)
+
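The same token-passing idiom works in Python 3 with the renamed `queue` module; a small self-contained sketch:

```python
import queue
import contextlib

@contextlib.contextmanager
def qlock(q, block=True, timeout=None):
    # take the token out of the queue, hand it to the with-body,
    # and always return it afterwards so the next holder can proceed
    item = q.get(block, timeout)
    try:
        yield item
    finally:
        q.put(item)

lock = queue.Queue()
lock.put(5)
with qlock(lock) as i:
    print(i)  # 5
print(lock.qsize())  # 1 -- token returned
```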
+
+@contextlib.contextmanager
+def flock(path, timeout=-1):
+ WAIT_FOREVER = -1
+ DELAY = 0.1
+ timeSpent = 0
+
+ acquired = False
+
+ while timeSpent <= timeout or timeout == WAIT_FOREVER:
+ try:
+ fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_RDWR)
+ acquired = True
+ break
+ except OSError, e:
+ if e.errno != errno.EEXIST:
+ raise
+ time.sleep(DELAY)
+ timeSpent += DELAY
+
+ assert acquired, "Failed to grab file-lock %s within timeout %d" % (path, timeout)
+
+ try:
+ yield fd
+ finally:
+ os.close(fd)
+ os.unlink(path)
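A Python 3 sketch of the same O_CREAT|O_EXCL lock-file pattern, where the open itself is the atomic test-and-set (uses a scratch directory for the demo):

```python
import os
import tempfile
import contextlib

@contextlib.contextmanager
def flock(path):
    # O_CREAT | O_EXCL makes creation atomic: the open fails with
    # EEXIST if another process already holds the lock file
    fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_RDWR)
    try:
        yield fd
    finally:
        os.close(fd)
        os.unlink(path)

lock_path = os.path.join(tempfile.mkdtemp(), "demo.lock")
with flock(lock_path):
    print(os.path.exists(lock_path))  # True
print(os.path.exists(lock_path))  # False
```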
--- /dev/null
+#!/usr/bin/env python\r
+\r
+"""\r
+Uses for generators\r
+* Pull pipelining (iterators)\r
+* Push pipelining (coroutines)\r
+* State machines (coroutines)\r
+* "Cooperative multitasking" (coroutines)\r
+* Algorithm -> Object transform for cohesiveness (for example context managers) (coroutines)\r
+\r
+Design considerations\r
+* When should a stage pass on exceptions or have it thrown within it?\r
+* When should a stage pass on GeneratorExits?\r
+* Is there a way to either turn a push generator into an iterator or to use\r
+ comprehension syntax for push generators (I doubt it)\r
+* When should the stage try to send data in both directions\r
+* Since pull generators (generators), push generators (coroutines), subroutines,\r
+ and coroutines are all coroutines, maybe we should rename the push generators\r
+ to avoid confusion (signals/slots?) and reserve "coroutines" for two-way\r
+ generators\r
+** If so, make s* and co* implementations of the functions\r
+"""\r
+\r
+import threading\r
+import Queue\r
+import pickle\r
+import functools\r
+import itertools\r
+import xml.sax\r
+import xml.parsers.expat\r
+\r
+\r
+def autostart(func):\r
+ """\r
+ >>> @autostart\r
+ ... def grep_sink(pattern):\r
+ ... print "Looking for %s" % pattern\r
+ ... while True:\r
+ ... line = yield\r
+ ... if pattern in line:\r
+ ... print line,\r
+ >>> g = grep_sink("python")\r
+ Looking for python\r
+ >>> g.send("Yeah but no but yeah but no")\r
+ >>> g.send("A series of tubes")\r
+ >>> g.send("python generators rock!")\r
+ python generators rock!\r
+ >>> g.close()\r
+ """\r
+\r
+ @functools.wraps(func)\r
+ def start(*args, **kwargs):\r
+ cr = func(*args, **kwargs)\r
+ cr.next()\r
+ return cr\r
+\r
+ return start\r
+\r
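In Python 3 the same priming step uses the builtin `next()`; a minimal push-pipeline sketch built on it:

```python
import functools

def autostart(func):
    # advance the generator to its first yield so callers can
    # immediately .send() into it
    @functools.wraps(func)
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)
        return cr
    return start

@autostart
def append_sink(out):
    # simplest possible sink: collect everything pushed at it
    while True:
        out.append((yield))

collected = []
sink = append_sink(collected)
sink.send(1)
sink.send(2)
print(collected)  # [1, 2]
```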
+\r
+@autostart\r
+def printer_sink(format = "%s"):\r
+ """\r
+ >>> pr = printer_sink("%r")\r
+ >>> pr.send("Hello")\r
+ 'Hello'\r
+ >>> pr.send("5")\r
+ '5'\r
+ >>> pr.send(5)\r
+ 5\r
+ >>> p = printer_sink()\r
+ >>> p.send("Hello")\r
+ Hello\r
+ >>> p.send("World")\r
+ World\r
+ >>> # p.throw(RuntimeError, "Goodbye")\r
+ >>> # p.send("Meh")\r
+ >>> # p.close()\r
+ """\r
+ while True:\r
+ item = yield\r
+ print format % (item, )\r
+\r
+\r
+@autostart\r
+def null_sink():\r
+ """\r
+ Good for uses like with cochain to pick up any slack\r
+ """\r
+ while True:\r
+ item = yield\r
+\r
+\r
+def itr_source(itr, target):\r
+ """\r
+ >>> itr_source(xrange(2), printer_sink())\r
+ 0\r
+ 1\r
+ """\r
+ for item in itr:\r
+ target.send(item)\r
+\r
+\r
+@autostart\r
+def cofilter(predicate, target):\r
+ """\r
+ >>> p = printer_sink()\r
+ >>> cf = cofilter(None, p)\r
+ >>> cf.send("")\r
+ >>> cf.send("Hello")\r
+ Hello\r
+ >>> cf.send([])\r
+ >>> cf.send([1, 2])\r
+ [1, 2]\r
+ >>> cf.send(False)\r
+ >>> cf.send(True)\r
+ True\r
+ >>> cf.send(0)\r
+ >>> cf.send(1)\r
+ 1\r
+ >>> # cf.throw(RuntimeError, "Goodbye")\r
+ >>> # cf.send(False)\r
+ >>> # cf.send(True)\r
+ >>> # cf.close()\r
+ """\r
+ if predicate is None:\r
+ predicate = bool\r
+\r
+ while True:\r
+ try:\r
+ item = yield\r
+ if predicate(item):\r
+ target.send(item)\r
+ except StandardError, e:\r
+ target.throw(e.__class__, e.message)\r
+\r
+\r
+@autostart\r
+def comap(function, target):\r
+ """\r
+ >>> p = printer_sink()\r
+ >>> cm = comap(lambda x: x+1, p)\r
+ >>> cm.send(0)\r
+ 1\r
+ >>> cm.send(1.0)\r
+ 2.0\r
+ >>> cm.send(-2)\r
+ -1\r
+ >>> # cm.throw(RuntimeError, "Goodbye")\r
+ >>> # cm.send(0)\r
+ >>> # cm.send(1.0)\r
+ >>> # cm.close()\r
+ """\r
+ while True:\r
+ try:\r
+ item = yield\r
+ mappedItem = function(item)\r
+ target.send(mappedItem)\r
+ except StandardError, e:\r
+ target.throw(e.__class__, e.message)\r
+\r
+\r
+def func_sink(function):\r
+ return comap(function, null_sink())\r
+\r
+\r
+def expand_positional(function):\r
+\r
+ @functools.wraps(function)\r
+ def expander(item):\r
+ return function(*item)\r
+\r
+ return expander\r
+\r
+\r
+@autostart\r
+def append_sink(l):\r
+ """\r
+ >>> l = []\r
+ >>> apps = append_sink(l)\r
+ >>> apps.send(1)\r
+ >>> apps.send(2)\r
+ >>> apps.send(3)\r
+ >>> print l\r
+ [1, 2, 3]\r
+ """\r
+ while True:\r
+ item = yield\r
+ l.append(item)\r
+\r
+\r
+@autostart\r
+def last_n_sink(l, n = 1):\r
+ """\r
+ >>> l = []\r
+ >>> lns = last_n_sink(l)\r
+ >>> lns.send(1)\r
+ >>> lns.send(2)\r
+ >>> lns.send(3)\r
+ >>> print l\r
+ [3]\r
+ """\r
+ del l[:]\r
+ while True:\r
+ item = yield\r
+ extraCount = len(l) - n + 1\r
+ if 0 < extraCount:\r
+ del l[0:extraCount]\r
+ l.append(item)\r
+\r
+\r
+@autostart\r
+def coreduce(target, function, initializer = None):\r
+ """\r
+ >>> reduceResult = []\r
+ >>> lns = last_n_sink(reduceResult)\r
+ >>> cr = coreduce(lns, lambda x, y: x + y, 0)\r
+ >>> cr.send(1)\r
+ >>> cr.send(2)\r
+ >>> cr.send(3)\r
+ >>> print reduceResult\r
+ [6]\r
+ >>> cr = coreduce(lns, lambda x, y: x + y)\r
+ >>> cr.send(1)\r
+ >>> cr.send(2)\r
+ >>> cr.send(3)\r
+ >>> print reduceResult\r
+ [6]\r
+ """\r
+ isFirst = True\r
+ cumulativeRef = initializer\r
+ while True:\r
+ item = yield\r
+ if isFirst and initializer is None:\r
+ cumulativeRef = item\r
+ else:\r
+ cumulativeRef = function(cumulativeRef, item)\r
+ target.send(cumulativeRef)\r
+ isFirst = False\r
+\r
+\r
+@autostart\r
+def cotee(targets):\r
+ """\r
+ Takes a sequence of coroutines and sends the received items to all of them\r
+\r
+ >>> ct = cotee((printer_sink("1 %s"), printer_sink("2 %s")))\r
+ >>> ct.send("Hello")\r
+ 1 Hello\r
+ 2 Hello\r
+ >>> ct.send("World")\r
+ 1 World\r
+ 2 World\r
+ >>> # ct.throw(RuntimeError, "Goodbye")\r
+ >>> # ct.send("Meh")\r
+ >>> # ct.close()\r
+ """\r
+ while True:\r
+ try:\r
+ item = yield\r
+ for target in targets:\r
+ target.send(item)\r
+ except StandardError, e:\r
+ for target in targets:\r
+ target.throw(e.__class__, e.message)\r
+\r
+\r
+class CoTee(object):\r
+ """\r
+ >>> ct = CoTee()\r
+ >>> ct.register_sink(printer_sink("1 %s"))\r
+ >>> ct.register_sink(printer_sink("2 %s"))\r
+ >>> ct.stage.send("Hello")\r
+ 1 Hello\r
+ 2 Hello\r
+ >>> ct.stage.send("World")\r
+ 1 World\r
+ 2 World\r
+ >>> ct.register_sink(printer_sink("3 %s"))\r
+ >>> ct.stage.send("Foo")\r
+ 1 Foo\r
+ 2 Foo\r
+ 3 Foo\r
+ >>> # ct.stage.throw(RuntimeError, "Goodbye")\r
+ >>> # ct.stage.send("Meh")\r
+ >>> # ct.stage.close()\r
+ """\r
+\r
+ def __init__(self):\r
+ self.stage = self._stage()\r
+ self._targets = []\r
+\r
+ def register_sink(self, sink):\r
+ self._targets.append(sink)\r
+\r
+ def unregister_sink(self, sink):\r
+ self._targets.remove(sink)\r
+\r
+ def restart(self):\r
+ self.stage = self._stage()\r
+\r
+ @autostart\r
+ def _stage(self):\r
+ while True:\r
+ try:\r
+ item = yield\r
+ for target in self._targets:\r
+ target.send(item)\r
+ except StandardError, e:\r
+ for target in self._targets:\r
+ target.throw(e.__class__, e.message)\r
+\r
+\r
+def _flush_queue(queue):\r
+ while not queue.empty():\r
+ yield queue.get()\r
+\r
+\r
+@autostart\r
+def cocount(target, start = 0):\r
+ """\r
+ >>> cc = cocount(printer_sink("%s"))\r
+ >>> cc.send("a")\r
+ 0\r
+ >>> cc.send(None)\r
+ 1\r
+ >>> cc.send([])\r
+ 2\r
+ >>> cc.send(0)\r
+ 3\r
+ """\r
+ for i in itertools.count(start):\r
+ item = yield\r
+ target.send(i)\r
+\r
+\r
+@autostart\r
+def coenumerate(target, start = 0):\r
+ """\r
+ >>> ce = coenumerate(printer_sink("%r"))\r
+ >>> ce.send("a")\r
+ (0, 'a')\r
+ >>> ce.send(None)\r
+ (1, None)\r
+ >>> ce.send([])\r
+ (2, [])\r
+ >>> ce.send(0)\r
+ (3, 0)\r
+ """\r
+ for i in itertools.count(start):\r
+ item = yield\r
+ decoratedItem = i, item\r
+ target.send(decoratedItem)\r
+\r
+\r
+@autostart\r
+def corepeat(target, elem):\r
+ """\r
+ >>> cr = corepeat(printer_sink("%s"), "Hello World")\r
+ >>> cr.send("a")\r
+ Hello World\r
+ >>> cr.send(None)\r
+ Hello World\r
+ >>> cr.send([])\r
+ Hello World\r
+ >>> cr.send(0)\r
+ Hello World\r
+ """\r
+ while True:\r
+ item = yield\r
+ target.send(elem)\r
+\r
+\r
+@autostart\r
+def cointercept(target, elems):\r
+ """\r
+ >>> cr = cointercept(printer_sink("%s"), [1, 2, 3, 4])\r
+ >>> cr.send("a")\r
+ 1\r
+ >>> cr.send(None)\r
+ 2\r
+ >>> cr.send([])\r
+ 3\r
+ >>> cr.send(0)\r
+ 4\r
+ >>> cr.send("Bye")\r
+ Traceback (most recent call last):\r
+ ...\r
+ StopIteration\r
+ """\r
+ item = yield\r
+ for elem in elems:\r
+ target.send(elem)\r
+ item = yield\r
+\r
+\r
+@autostart\r
+def codropwhile(target, pred):\r
+ """\r
+ >>> cdw = codropwhile(printer_sink("%s"), lambda x: x)\r
+ >>> cdw.send([0, 1, 2])\r
+ >>> cdw.send(1)\r
+ >>> cdw.send(True)\r
+ >>> cdw.send(False)\r
+ >>> cdw.send([0, 1, 2])\r
+ [0, 1, 2]\r
+ >>> cdw.send(1)\r
+ 1\r
+ >>> cdw.send(True)\r
+ True\r
+ """\r
+ while True:\r
+ item = yield\r
+ if not pred(item):\r
+ break\r
+\r
+ while True:\r
+ item = yield\r
+ target.send(item)\r
+\r
+\r
+@autostart\r
+def cotakewhile(target, pred):\r
+ """\r
+ >>> ctw = cotakewhile(printer_sink("%s"), lambda x: x)\r
+ >>> ctw.send([0, 1, 2])\r
+ [0, 1, 2]\r
+ >>> ctw.send(1)\r
+ 1\r
+ >>> ctw.send(True)\r
+ True\r
+ >>> ctw.send(False)\r
+ >>> ctw.send([0, 1, 2])\r
+ >>> ctw.send(1)\r
+ >>> ctw.send(True)\r
+ """\r
+ while True:\r
+ item = yield\r
+ if not pred(item):\r
+ break\r
+ target.send(item)\r
+\r
+ while True:\r
+ item = yield\r
+\r
+\r
+@autostart\r
+def coslice(target, lower, upper):\r
+ """\r
+ >>> cs = coslice(printer_sink("%r"), 3, 5)\r
+ >>> cs.send("0")\r
+ >>> cs.send("1")\r
+ >>> cs.send("2")\r
+ >>> cs.send("3")\r
+ '3'\r
+ >>> cs.send("4")\r
+ '4'\r
+ >>> cs.send("5")\r
+ >>> cs.send("6")\r
+ """\r
+ for i in xrange(lower):\r
+ item = yield\r
+ for i in xrange(upper - lower):\r
+ item = yield\r
+ target.send(item)\r
+ while True:\r
+ item = yield\r
+\r
+\r
+@autostart\r
+def cochain(targets):\r
+ """\r
+ >>> cr = cointercept(printer_sink("good %s"), [1, 2, 3, 4])\r
+ >>> cc = cochain([cr, printer_sink("end %s")])\r
+ >>> cc.send("a")\r
+ good 1\r
+ >>> cc.send(None)\r
+ good 2\r
+ >>> cc.send([])\r
+ good 3\r
+ >>> cc.send(0)\r
+ good 4\r
+ >>> cc.send("Bye")\r
+ end Bye\r
+ """\r
+ behind = []\r
+ for target in targets:\r
+ try:\r
+ while behind:\r
+ item = behind.pop()\r
+ target.send(item)\r
+ while True:\r
+ item = yield\r
+ target.send(item)\r
+ except StopIteration:\r
+ behind.append(item)\r
+\r
+\r
+@autostart\r
+def queue_sink(queue):\r
+ """\r
+ >>> q = Queue.Queue()\r
+ >>> qs = queue_sink(q)\r
+ >>> qs.send("Hello")\r
+ >>> qs.send("World")\r
+ >>> qs.throw(RuntimeError, "Goodbye")\r
+ >>> qs.send("Meh")\r
+ >>> qs.close()\r
+ >>> print [i for i in _flush_queue(q)]\r
+ [(None, 'Hello'), (None, 'World'), (<type 'exceptions.RuntimeError'>, 'Goodbye'), (None, 'Meh'), (<type 'exceptions.GeneratorExit'>, None)]\r
+ """\r
+ while True:\r
+ try:\r
+ item = yield\r
+ queue.put((None, item))\r
+ except StandardError, e:\r
+ queue.put((e.__class__, e.message))\r
+ except GeneratorExit:\r
+ queue.put((GeneratorExit, None))\r
+ raise\r
+\r
+\r
+def decode_item(item, target):\r
+ if item[0] is None:\r
+ target.send(item[1])\r
+ return False\r
+ elif item[0] is GeneratorExit:\r
+ target.close()\r
+ return True\r
+ else:\r
+ target.throw(item[0], item[1])\r
+ return False\r
+\r
+\r
+def queue_source(queue, target):\r
+ """\r
+ >>> q = Queue.Queue()\r
+ >>> for i in [\r
+ ... (None, 'Hello'),\r
+ ... (None, 'World'),\r
+ ... (GeneratorExit, None),\r
+ ... ]:\r
+ ... q.put(i)\r
+ >>> qs = queue_source(q, printer_sink())\r
+ Hello\r
+ World\r
+ """\r
+ isDone = False\r
+ while not isDone:\r
+ item = queue.get()\r
+ isDone = decode_item(item, target)\r
+\r
+\r
+def threaded_stage(target, thread_factory = threading.Thread):\r
+ messages = Queue.Queue()\r
+\r
+ run_source = functools.partial(queue_source, messages, target)\r
+ thread_factory(target=run_source).start()\r
+\r
+ # Sink running in current thread\r
+ return functools.partial(queue_sink, messages)\r
+\r
+\r
+@autostart\r
+def pickle_sink(f):\r
+ while True:\r
+ try:\r
+ item = yield\r
+ pickle.dump((None, item), f)\r
+ except StandardError, e:\r
+ pickle.dump((e.__class__, e.message), f)\r
+ except GeneratorExit:\r
+ pickle.dump((GeneratorExit, ), f)\r
+ raise\r
+ except StopIteration:\r
+ f.close()\r
+ return\r
+\r
+\r
+def pickle_source(f, target):\r
+ try:\r
+ isDone = False\r
+ while not isDone:\r
+ item = pickle.load(f)\r
+ isDone = decode_item(item, target)\r
+ except EOFError:\r
+ target.close()\r
+\r
+\r
+class EventHandler(object, xml.sax.ContentHandler):\r
+\r
+ START = "start"\r
+ TEXT = "text"\r
+ END = "end"\r
+\r
+ def __init__(self, target):\r
+ object.__init__(self)\r
+ xml.sax.ContentHandler.__init__(self)\r
+ self._target = target\r
+\r
+ def startElement(self, name, attrs):\r
+ self._target.send((self.START, (name, attrs._attrs)))\r
+\r
+ def characters(self, text):\r
+ self._target.send((self.TEXT, text))\r
+\r
+ def endElement(self, name):\r
+ self._target.send((self.END, name))\r
+\r
+\r
+def expat_parse(f, target):\r
+ parser = xml.parsers.expat.ParserCreate()\r
+ parser.buffer_size = 65536\r
+ parser.buffer_text = True\r
+ parser.returns_unicode = False\r
+ parser.StartElementHandler = lambda name, attrs: target.send(('start', (name, attrs)))\r
+ parser.EndElementHandler = lambda name: target.send(('end', name))\r
+ parser.CharacterDataHandler = lambda data: target.send(('text', data))\r
+ parser.ParseFile(f)\r
+\r
+\r
+if __name__ == "__main__":\r
+ import doctest\r
+ doctest.testmod()\r
--- /dev/null
+#!/usr/bin/env python
+
+from __future__ import with_statement
+
+import time
+import functools
+
+import gobject
+
+
+def async(func):
+ """
+ Make a function mainloop friendly. the function will be called at the
+ next mainloop idle state.
+
+ >>> import misc
+ >>> misc.validate_decorator(async)
+ """
+
+ @functools.wraps(func)
+ def new_function(*args, **kwargs):
+
+ def async_function():
+ func(*args, **kwargs)
+ return False
+
+ gobject.idle_add(async_function)
+
+ return new_function
+
+
+def throttled(minDelay, queue):
+ """
+ Throttle the calls to a function by queueing all the calls that happen
+ before the minimum delay
+
+ >>> import misc
+ >>> misc.validate_decorator(throttled(0, []))
+ """
+
+ def actual_decorator(func):
+
+ lastCallTime = [None]
+
+ def process_queue():
+ if 0 < len(queue):
+ func, args, kwargs = queue.pop(0)
+ lastCallTime[0] = time.time() * 1000
+ func(*args, **kwargs)
+ return False
+
+ @functools.wraps(func)
+ def new_function(*args, **kwargs):
+ now = time.time() * 1000
+ if (
+ lastCallTime[0] is None or
+ (now - lastCallTime[0] >= minDelay)
+ ):
+ lastCallTime[0] = now
+ func(*args, **kwargs)
+ else:
+ queue.append((func, args, kwargs))
+ lastCallDelta = now - lastCallTime[0]
+ processQueueTimeout = int(minDelay * len(queue) - lastCallDelta)
+ gobject.timeout_add(processQueueTimeout, process_queue)
+
+ return new_function
+
+ return actual_decorator
--- /dev/null
+#!/usr/bin/env python
+
+
+from __future__ import with_statement
+
+import os
+import pickle
+import contextlib
+import itertools
+import functools
+
+
+@contextlib.contextmanager
+def change_directory(directory):
+ previousDirectory = os.getcwd()
+ os.chdir(directory)
+ currentDirectory = os.getcwd()
+
+ try:
+ yield previousDirectory, currentDirectory
+ finally:
+ os.chdir(previousDirectory)
+
+
+@contextlib.contextmanager
+def pickled(filename):
+ """
+ Here is an example usage:
+ with pickled("foo.db") as p:
+ p("users", list).append(["srid", "passwd", 23])
+ """
+
+ if os.path.isfile(filename):
+ with open(filename, "rb") as pickleFile:
+ data = pickle.load(pickleFile)
+ else:
+ data = {}
+
+ def getter(item, factory):
+ if item in data:
+ return data[item]
+ else:
+ data[item] = factory()
+ return data[item]
+
+ yield getter
+
+ with open(filename, "wb") as pickleFile:
+ pickle.dump(data, pickleFile)
+
+
+@contextlib.contextmanager
+def redirect(object_, attr, value):
+ """
+ >>> import sys
+ >>> with redirect(sys, 'stdout', open('stdout', 'w')):
+ ... print "hello"
+ ...
+ >>> print "we're back"
+ we're back
+ """
+ orig = getattr(object_, attr)
+ setattr(object_, attr, value)
+ try:
+ yield
+ finally:
+ setattr(object_, attr, orig)
+
+
+def pathsplit(path):
+ """
+ >>> pathsplit("/a/b/c")
+ ['', 'a', 'b', 'c']
+ >>> pathsplit("./plugins/builtins.ini")
+ ['.', 'plugins', 'builtins.ini']
+ """
+ pathParts = path.split(os.path.sep)
+ return pathParts
+
+
+def commonpath(l1, l2, common=None):
+ """
+ >>> commonpath(pathsplit('/a/b/c/d'), pathsplit('/a/b/c1/d1'))
+ (['', 'a', 'b'], ['c', 'd'], ['c1', 'd1'])
+ >>> commonpath(pathsplit("./plugins/"), pathsplit("./plugins/builtins.ini"))
+ (['.', 'plugins'], [''], ['builtins.ini'])
+ >>> commonpath(pathsplit("./plugins/builtins"), pathsplit("./plugins"))
+ (['.', 'plugins'], ['builtins'], [])
+ """
+ if common is None:
+ common = []
+
+ if l1 == l2:
+ return l1, [], []
+
+ for i, (leftDir, rightDir) in enumerate(zip(l1, l2)):
+ if leftDir != rightDir:
+ return l1[0:i], l1[i:], l2[i:]
+ else:
+ if leftDir == rightDir:
+ i += 1
+ return l1[0:i], l1[i:], l2[i:]
+
+
+def relpath(p1, p2):
+ """
+ >>> relpath('/', '/')
+ './'
+ >>> relpath('/a/b/c/d', '/')
+ '../../../../'
+ >>> relpath('/a/b/c/d', '/a/b/c1/d1')
+ '../../c1/d1'
+ >>> relpath('/a/b/c/d', '/a/b/c1/d1/')
+ '../../c1/d1'
+ >>> relpath("./plugins/builtins", "./plugins")
+ '../'
+ >>> relpath("./plugins/", "./plugins/builtins.ini")
+ 'builtins.ini'
+ """
+ sourcePath = os.path.normpath(p1)
+ destPath = os.path.normpath(p2)
+
+ (common, sourceOnly, destOnly) = commonpath(pathsplit(sourcePath), pathsplit(destPath))
+ if len(sourceOnly) or len(destOnly):
+ relParts = itertools.chain(
+ (('..' + os.sep) * len(sourceOnly), ),
+ destOnly,
+ )
+ return os.path.join(*relParts)
+ else:
+ return "."+os.sep
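Since Python 2.6 the stdlib covers this case; `posixpath.relpath` gives the same answers as `relpath` above for the slash-separated doctest inputs (modulo the trailing separator on directory results):

```python
import posixpath

# stdlib equivalent of relpath() above for POSIX-style paths
print(posixpath.relpath('/a/b/c1/d1', '/a/b/c/d'))
# ../../c1/d1
print(posixpath.relpath('./plugins/builtins.ini', './plugins'))
# builtins.ini
```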
--- /dev/null
+#!/usr/bin/env python
+
+
+import logging
+
+
+def set_process_name(name):
+ try: # change process name for killall
+ import ctypes
+ libc = ctypes.CDLL('libc.so.6')
+ libc.prctl(15, name, 0, 0, 0)
+ except Exception, e:
+ logging.warning("Unable to set process name: %s" % e)
--- /dev/null
+#!/usr/bin/env python
+
+from __future__ import with_statement
+
+import sys
+import cPickle
+
+import functools
+import contextlib
+import inspect
+
+import optparse
+import traceback
+import warnings
+import string
+
+
+def printfmt(template):
+ """
+ This hides having to create the Template object and call substitute/safe_substitute on it. For example:
+
+ >>> num = 10
+ >>> word = "spam"
+ >>> printfmt("I would like to order $num units of $word, please") #doctest: +SKIP
+ I would like to order 10 units of spam, please
+ """
+	frame = inspect.stack()[1][0]  # the caller's frame, whose locals we substitute
+ try:
+ print string.Template(template).safe_substitute(frame.f_locals)
+ finally:
+ del frame
+
+
+def is_special(name):
+ return name.startswith("__") and name.endswith("__")
+
+
+def is_private(name):
+ return name.startswith("_") and not is_special(name)
+
+
+def privatize(clsName, attributeName):
+ """
+ At runtime, make an attributeName private
+
+ Example:
+ >>> class Test(object):
+ ... pass
+ ...
+ >>> try:
+ ... dir(Test).index("_Test__me")
+ ... print dir(Test)
+ ... except:
+ ... print "Not Found"
+ Not Found
+ >>> setattr(Test, privatize(Test.__name__, "me"), "Hello World")
+ >>> try:
+ ... dir(Test).index("_Test__me")
+ ... print "Found"
+ ... except:
+ ... print dir(Test)
+ 0
+ Found
+	>>> print getattr(Test, privatize(Test.__name__, "me"))
+ Hello World
+ >>>
+ >>> is_private(privatize(Test.__name__, "me"))
+ True
+ >>> is_special(privatize(Test.__name__, "me"))
+ False
+ """
+ return "".join(["_", clsName, "__", attributeName])
+
+
+def obfuscate(clsName, attributeName):
+ """
+ At runtime, turn a private name into the obfuscated form
+
+ Example:
+ >>> class Test(object):
+ ... __me = "Hello World"
+ ...
+ >>> try:
+ ... dir(Test).index("_Test__me")
+ ... print "Found"
+ ... except:
+ ... print dir(Test)
+ 0
+ Found
+ >>> print getattr(Test, obfuscate(Test.__name__, "__me"))
+ Hello World
+ >>> is_private(obfuscate(Test.__name__, "__me"))
+ True
+ >>> is_special(obfuscate(Test.__name__, "__me"))
+ False
+ """
+ return "".join(["_", clsName, attributeName])
+
+
+class PAOptionParser(optparse.OptionParser, object):
+ """
+ >>> if __name__ == '__main__':
+ ... #parser = PAOptionParser("My usage str")
+ ... parser = PAOptionParser()
+ ... parser.add_posarg("Foo", help="Foo usage")
+ ... parser.add_posarg("Bar", dest="bar_dest")
+ ... parser.add_posarg("Language", dest='tr_type', type="choice", choices=("Python", "Other"))
+ ... parser.add_option('--stocksym', dest='symbol')
+ ... values, args = parser.parse_args()
+ ... print values, args
+ ...
+
+ python mycp.py -h
+ python mycp.py
+ python mycp.py foo
+ python mycp.py foo bar
+
+ python mycp.py foo bar lava
+ Usage: pa.py <Foo> <Bar> <Language> [options]
+
+ Positional Arguments:
+ Foo: Foo usage
+ Bar:
+ Language:
+
+	pa.py: error: option --Language: invalid choice: 'lava' (choose from 'Python', 'Other')
+ """
+
+ def __init__(self, *args, **kw):
+ self.posargs = []
+ super(PAOptionParser, self).__init__(*args, **kw)
+
+ def add_posarg(self, *args, **kw):
+ pa_help = kw.get("help", "")
+ kw["help"] = optparse.SUPPRESS_HELP
+ o = self.add_option("--%s" % args[0], *args[1:], **kw)
+ self.posargs.append((args[0], pa_help))
+
+ def get_usage(self, *args, **kwargs):
+ params = (' '.join(["<%s>" % arg[0] for arg in self.posargs]), '\n '.join(["%s: %s" % (arg) for arg in self.posargs]))
+ self.usage = "%%prog %s [options]\n\nPositional Arguments:\n %s" % params
+ return super(PAOptionParser, self).get_usage(*args, **kwargs)
+
+ def parse_args(self, *args, **kwargs):
+ args = sys.argv[1:]
+ args0 = []
+ for p, v in zip(self.posargs, args):
+ args0.append("--%s" % p[0])
+ args0.append(v)
+ args = args0 + args
+ options, args = super(PAOptionParser, self).parse_args(args, **kwargs)
+ if len(args) < len(self.posargs):
+ msg = 'Missing value(s) for "%s"\n' % ", ".join([arg[0] for arg in self.posargs][len(args):])
+ self.error(msg)
+ return options, args
+
+
+def explicitly(name, stackadd=0):
+ """
+ This is an alias for adding to '__all__'. Less error-prone than using
+ __all__ itself, since setting __all__ directly is prone to stomping on
+ things implicitly exported via L{alias}.
+
+ @note Taken from PyExport (which could turn out pretty cool):
+ @li @a http://codebrowse.launchpad.net/~glyph/
+ @li @a http://glyf.livejournal.com/74356.html
+ """
+ packageVars = sys._getframe(1+stackadd).f_locals
+ globalAll = packageVars.setdefault('__all__', [])
+ globalAll.append(name)
+
+
+def public(thunk):
+ """
+ This is a decorator, for convenience. Rather than typing the name of your
+ function twice, you can decorate a function with this.
+
+ To be real, @public would need to work on methods as well, which gets into
+ supporting types...
+
+ @note Taken from PyExport (which could turn out pretty cool):
+ @li @a http://codebrowse.launchpad.net/~glyph/
+ @li @a http://glyf.livejournal.com/74356.html
+ """
+ explicitly(thunk.__name__, 1)
+ return thunk
+
+
+def _append_docstring(obj, message):
+ if obj.__doc__ is None:
+ obj.__doc__ = message
+ else:
+ obj.__doc__ += message
+
+
+def validate_decorator(decorator):
+
+ def simple(x):
+ return x
+
+ f = simple
+ f.__name__ = "name"
+ f.__doc__ = "doc"
+ f.__dict__["member"] = True
+
+ g = decorator(f)
+
+ if f.__name__ != g.__name__:
+ print f.__name__, "!=", g.__name__
+
+ if g.__doc__ is None:
+ print decorator.__name__, "has no doc string"
+ elif not g.__doc__.startswith(f.__doc__):
+ print g.__doc__, "didn't start with", f.__doc__
+
+ if not ("member" in g.__dict__ and g.__dict__["member"]):
+ print "'member' not in ", g.__dict__
+
+
+def deprecated_api(func):
+ """
+ This is a decorator which can be used to mark functions
+ as deprecated. It will result in a warning being emitted
+ when the function is used.
+
+ >>> validate_decorator(deprecated_api)
+ """
+
+ @functools.wraps(func)
+ def newFunc(*args, **kwargs):
+ warnings.warn("Call to deprecated function %s." % func.__name__, category=DeprecationWarning)
+ return func(*args, **kwargs)
+ _append_docstring(newFunc, "\n@deprecated")
+ return newFunc
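Note that DeprecationWarning is ignored by default in later Python releases, so callers usually need a warnings filter to actually see the message; a sketch of exercising the decorator (hypothetical `old_add`):

```python
import functools
import warnings


def deprecated_api(func):
    @functools.wraps(func)
    def newFunc(*args, **kwargs):
        warnings.warn("Call to deprecated function %s." % func.__name__,
                      category=DeprecationWarning)
        return func(*args, **kwargs)
    return newFunc


@deprecated_api
def old_add(a, b):
    return a + b


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # surface DeprecationWarning too
    result = old_add(1, 2)

assert result == 3
assert caught[0].category is DeprecationWarning
```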
+
+
+def unstable_api(func):
+ """
+	This is a decorator which can be used to mark functions
+	as part of an unstable API. It will result in a warning being
+	emitted when the function is used.
+
+ >>> validate_decorator(unstable_api)
+ """
+
+ @functools.wraps(func)
+ def newFunc(*args, **kwargs):
+ warnings.warn("Call to unstable API function %s." % func.__name__, category=FutureWarning)
+ return func(*args, **kwargs)
+ _append_docstring(newFunc, "\n@unstable")
+ return newFunc
+
+
+def enabled(func):
+ """
+ This decorator doesn't add any behavior
+
+ >>> validate_decorator(enabled)
+ """
+ return func
+
+
+def disabled(func):
+ """
+	This decorator disables the provided function, replacing it with a no-op
+
+ >>> validate_decorator(disabled)
+ """
+
+ @functools.wraps(func)
+ def emptyFunc(*args, **kargs):
+ pass
+ _append_docstring(emptyFunc, "\n@note Temporarily Disabled")
+ return emptyFunc
+
+
+def metadata(document=True, **kwds):
+ """
+ >>> validate_decorator(metadata(author="Ed"))
+ """
+
+ def decorate(func):
+ for k, v in kwds.iteritems():
+ setattr(func, k, v)
+ if document:
+ _append_docstring(func, "\n@"+k+" "+v)
+ return func
+ return decorate
+
+
+def prop(func):
+ """Function decorator for defining property attributes
+
+ The decorated function is expected to return a dictionary
+ containing one or more of the following pairs:
+ fget - function for getting attribute value
+ fset - function for setting attribute value
+ fdel - function for deleting attribute
+ This can be conveniently constructed by the locals() builtin
+ function; see:
+ http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/205183
+ @author http://kbyanc.blogspot.com/2007/06/python-property-attribute-tricks.html
+
+ Example:
+ >>> #Due to transformation from function to property, does not need to be validated
+ >>> #validate_decorator(prop)
+ >>> class MyExampleClass(object):
+ ... @prop
+ ... def foo():
+ ... "The foo property attribute's doc-string"
+ ... def fget(self):
+ ... print "GET"
+ ... return self._foo
+ ... def fset(self, value):
+ ... print "SET"
+ ... self._foo = value
+ ... return locals()
+ ...
+ >>> me = MyExampleClass()
+ >>> me.foo = 10
+ SET
+ >>> print me.foo
+ GET
+ 10
+ """
+ return property(doc=func.__doc__, **func())
+
+
+def print_handler(e):
+ """
+ @see ExpHandler
+ """
+ print "%s: %s" % (type(e).__name__, e)
+
+
+def print_ignore(e):
+ """
+ @see ExpHandler
+ """
+ print 'Ignoring %s exception: %s' % (type(e).__name__, e)
+
+
+def print_traceback(e):
+ """
+ @see ExpHandler
+ """
+ #print sys.exc_info()
+ traceback.print_exc(file=sys.stdout)
+
+
+def ExpHandler(handler=print_handler, *exceptions):
+	"""
+	An exception handling idiom using decorators.
+
+	Specify exceptions in order: the first one listed is handled first,
+	the last one last.
+
+ >>> validate_decorator(ExpHandler())
+ >>> @ExpHandler(print_ignore, ZeroDivisionError)
+ ... @ExpHandler(None, AttributeError, ValueError)
+ ... def f1():
+ ... 1/0
+ >>> @ExpHandler(print_traceback, ZeroDivisionError)
+ ... def f2():
+ ... 1/0
+ >>> @ExpHandler()
+ ... def f3(*pargs):
+ ... l = pargs
+ ... return l[10]
+ >>> @ExpHandler(print_traceback, ZeroDivisionError)
+ ... def f4():
+ ... return 1
+ >>>
+ >>>
+ >>> f1()
+ Ignoring ZeroDivisionError exception: integer division or modulo by zero
+ >>> f2() # doctest: +ELLIPSIS, +NORMALIZE_WHITESPACE
+ Traceback (most recent call last):
+ ...
+ ZeroDivisionError: integer division or modulo by zero
+ >>> f3()
+ IndexError: tuple index out of range
+ >>> f4()
+ 1
+ """
+
+ def wrapper(f):
+ localExceptions = exceptions
+ if not localExceptions:
+ localExceptions = [Exception]
+ t = [(ex, handler) for ex in localExceptions]
+ t.reverse()
+
+ def newfunc(t, *args, **kwargs):
+ ex, handler = t[0]
+ try:
+ if len(t) == 1:
+ return f(*args, **kwargs)
+ else:
+ #Recurse for embedded try/excepts
+ dec_func = functools.partial(newfunc, t[1:])
+ dec_func = functools.update_wrapper(dec_func, f)
+ return dec_func(*args, **kwargs)
+ except ex, e:
+ return handler(e)
+
+ dec_func = functools.partial(newfunc, t)
+ dec_func = functools.update_wrapper(dec_func, f)
+ return dec_func
+ return wrapper
+
+
+class bindclass(object):
+ """
+ >>> validate_decorator(bindclass)
+ >>> class Foo(BoundObject):
+ ... @bindclass
+ ... def foo(this_class, self):
+ ... return this_class, self
+ ...
+ >>> class Bar(Foo):
+ ... @bindclass
+ ... def bar(this_class, self):
+ ... return this_class, self
+ ...
+ >>> f = Foo()
+ >>> b = Bar()
+ >>>
+ >>> f.foo() # doctest: +ELLIPSIS
+ (<class '...Foo'>, <...Foo object at ...>)
+ >>> b.foo() # doctest: +ELLIPSIS
+ (<class '...Foo'>, <...Bar object at ...>)
+ >>> b.bar() # doctest: +ELLIPSIS
+ (<class '...Bar'>, <...Bar object at ...>)
+ """
+
+ def __init__(self, f):
+ self.f = f
+ self.__name__ = f.__name__
+ self.__doc__ = f.__doc__
+ self.__dict__.update(f.__dict__)
+ self.m = None
+
+ def bind(self, cls, attr):
+
+ def bound_m(*args, **kwargs):
+ return self.f(cls, *args, **kwargs)
+ bound_m.__name__ = attr
+ self.m = bound_m
+
+ def __get__(self, obj, objtype=None):
+ return self.m.__get__(obj, objtype)
+
+
+class ClassBindingSupport(type):
+ "@see bindclass"
+
+ def __init__(mcs, name, bases, attrs):
+ type.__init__(mcs, name, bases, attrs)
+ for attr, val in attrs.iteritems():
+ if isinstance(val, bindclass):
+ val.bind(mcs, attr)
+
+
+class BoundObject(object):
+ "@see bindclass"
+ __metaclass__ = ClassBindingSupport
+
+
+def bindfunction(f):
+ """
+ >>> validate_decorator(bindfunction)
+ >>> @bindfunction
+ ... def factorial(thisfunction, n):
+ ... # Within this function the name 'thisfunction' refers to the factorial
+ ... # function(with only one argument), even after 'factorial' is bound
+ ... # to another object
+ ... if n > 0:
+ ... return n * thisfunction(n - 1)
+ ... else:
+ ... return 1
+ ...
+ >>> factorial(3)
+ 6
+ """
+
+ @functools.wraps(f)
+ def bound_f(*args, **kwargs):
+ return f(bound_f, *args, **kwargs)
+ return bound_f
+
+
+class Memoize(object):
+ """
+ Memoize(fn) - an instance which acts like fn but memoizes its arguments
+ Will only work on functions with non-mutable arguments
+ @note Source: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52201
+
+ >>> validate_decorator(Memoize)
+ """
+
+ def __init__(self, fn):
+ self.fn = fn
+ self.__name__ = fn.__name__
+ self.__doc__ = fn.__doc__
+ self.__dict__.update(fn.__dict__)
+ self.memo = {}
+
+ def __call__(self, *args):
+ if args not in self.memo:
+ self.memo[args] = self.fn(*args)
+ return self.memo[args]
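A quick illustration of the caching behavior, assuming hashable arguments (hypothetical `slow_square`; the class body is a trimmed copy of Memoize above):

```python
class Memoize(object):
    # trimmed copy of the class above, enough to show the caching
    def __init__(self, fn):
        self.fn = fn
        self.memo = {}

    def __call__(self, *args):
        if args not in self.memo:
            self.memo[args] = self.fn(*args)
        return self.memo[args]


calls = []


@Memoize
def slow_square(x):
    calls.append(x)  # record every real invocation
    return x * x


assert slow_square(4) == 16
assert slow_square(4) == 16  # second call served from the cache
assert calls == [4]  # the underlying function ran only once
```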
+
+
+class MemoizeMutable(object):
+ """Memoize(fn) - an instance which acts like fn but memoizes its arguments
+ Will work on functions with mutable arguments(slower than Memoize)
+ @note Source: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52201
+
+ >>> validate_decorator(MemoizeMutable)
+ """
+
+ def __init__(self, fn):
+ self.fn = fn
+ self.__name__ = fn.__name__
+ self.__doc__ = fn.__doc__
+ self.__dict__.update(fn.__dict__)
+ self.memo = {}
+
+ def __call__(self, *args, **kw):
+ text = cPickle.dumps((args, kw))
+ if text not in self.memo:
+ self.memo[text] = self.fn(*args, **kw)
+ return self.memo[text]
+
+
+callTraceIndentationLevel = 0
+
+
+def call_trace(f):
+ """
+	Call-tracing decorator: prints function entry and exit, indenting nested calls.
+
+ >>> validate_decorator(call_trace)
+ >>> @call_trace
+ ... def a(a, b, c):
+ ... pass
+ >>> a(1, 2, c=3)
+ Entering a((1, 2), {'c': 3})
+ Exiting a((1, 2), {'c': 3})
+ """
+
+ @functools.wraps(f)
+ def verboseTrace(*args, **kw):
+ global callTraceIndentationLevel
+
+ print "%sEntering %s(%s, %s)" % ("\t"*callTraceIndentationLevel, f.__name__, args, kw)
+ callTraceIndentationLevel += 1
+ try:
+ result = f(*args, **kw)
+ except:
+ callTraceIndentationLevel -= 1
+ print "%sException %s(%s, %s)" % ("\t"*callTraceIndentationLevel, f.__name__, args, kw)
+ raise
+ callTraceIndentationLevel -= 1
+ print "%sExiting %s(%s, %s)" % ("\t"*callTraceIndentationLevel, f.__name__, args, kw)
+ return result
+
+ @functools.wraps(f)
+ def smallTrace(*args, **kw):
+ global callTraceIndentationLevel
+
+ print "%sEntering %s" % ("\t"*callTraceIndentationLevel, f.__name__)
+ callTraceIndentationLevel += 1
+ try:
+ result = f(*args, **kw)
+ except:
+ callTraceIndentationLevel -= 1
+ print "%sException %s" % ("\t"*callTraceIndentationLevel, f.__name__)
+ raise
+ callTraceIndentationLevel -= 1
+ print "%sExiting %s" % ("\t"*callTraceIndentationLevel, f.__name__)
+ return result
+
+ #return smallTrace
+ return verboseTrace
+
+
+@contextlib.contextmanager
+def lexical_scope(*args):
+ """
+ @note Source: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/520586
+ Example:
+ >>> b = 0
+ >>> with lexical_scope(1) as (a):
+ ... print a
+ ...
+ 1
+ >>> with lexical_scope(1,2,3) as (a,b,c):
+ ... print a,b,c
+ ...
+ 1 2 3
+ >>> with lexical_scope():
+ ... d = 10
+ ... def foo():
+ ... pass
+ ...
+ >>> print b
+ 2
+ """
+
+ frame = inspect.currentframe().f_back.f_back
+ saved = frame.f_locals.keys()
+ try:
+ if not args:
+ yield
+ elif len(args) == 1:
+ yield args[0]
+ else:
+ yield args
+ finally:
+ f_locals = frame.f_locals
+ for key in (x for x in f_locals.keys() if x not in saved):
+ del f_locals[key]
+ del frame
--- /dev/null
+#!/usr/bin/env python
+import new
+
+# Make the environment more like Python 3.0
+__metaclass__ = type
+from itertools import izip as zip
+import textwrap
+import inspect
+
+
+__all__ = [
+ "AnyType",
+ "overloaded"
+]
+
+
+AnyType = object
+
+
+class overloaded:
+ """
+ Dynamically overloaded functions.
+
+ This is an implementation of (dynamically, or run-time) overloaded
+ functions; also known as generic functions or multi-methods.
+
+	The dispatch algorithm uses the types of all arguments for dispatch,
+ similar to (compile-time) overloaded functions or methods in C++ and
+ Java.
+
+ Most of the complexity in the algorithm comes from the need to support
+	subclasses in call signatures. For example, if a function is
+ registered for a signature (T1, T2), then a call with a signature (S1,
+ S2) is acceptable, assuming that S1 is a subclass of T1, S2 a subclass
+ of T2, and there are no other more specific matches (see below).
+
+ If there are multiple matches and one of those doesn't *dominate* all
+ others, the match is deemed ambiguous and an exception is raised. A
+ subtlety here: if, after removing the dominated matches, there are
+ still multiple matches left, but they all map to the same function,
+ then the match is not deemed ambiguous and that function is used.
+ Read the method find_func() below for details.
+
+ @note Python 2.5 is required due to the use of predicates any() and all().
+ @note only supports positional arguments
+
+ @author http://www.artima.com/weblogs/viewpost.jsp?thread=155514
+
+ >>> import misc
+ >>> misc.validate_decorator (overloaded)
+ >>>
+ >>>
+ >>>
+ >>>
+ >>> #################
+ >>> #Basics, with reusing names and without
+ >>> @overloaded
+ ... def foo(x):
+ ... "prints x"
+ ... print x
+ ...
+ >>> @foo.register(int)
+ ... def foo(x):
+ ... "prints the hex representation of x"
+ ... print hex(x)
+ ...
+ >>> from types import DictType
+ >>> @foo.register(DictType)
+ ... def foo_dict(x):
+ ... "prints the keys of x"
+ ... print [k for k in x.iterkeys()]
+ ...
+ >>> #combines all of the doc strings to help keep track of the specializations
+ >>> foo.__doc__ # doctest: +ELLIPSIS
+ "prints x\\n\\n...overloading.foo (<type 'int'>):\\n\\tprints the hex representation of x\\n\\n...overloading.foo_dict (<type 'dict'>):\\n\\tprints the keys of x"
+ >>> foo ("text")
+ text
+ >>> foo (10) #calling the specialized foo
+ 0xa
+ >>> foo ({3:5, 6:7}) #calling the specialization foo_dict
+ [3, 6]
+ >>> foo_dict ({3:5, 6:7}) #with using a unique name, you still have the option of calling the function directly
+ [3, 6]
+ >>>
+ >>>
+ >>>
+ >>>
+ >>> #################
+ >>> #Multiple arguments, accessing the default, and function finding
+ >>> @overloaded
+ ... def two_arg (x, y):
+ ... print x,y
+ ...
+ >>> @two_arg.register(int, int)
+ ... def two_arg_int_int (x, y):
+ ... print hex(x), hex(y)
+ ...
+ >>> @two_arg.register(float, int)
+ ... def two_arg_float_int (x, y):
+ ... print x, hex(y)
+ ...
+ >>> @two_arg.register(int, float)
+ ... def two_arg_int_float (x, y):
+ ... print hex(x), y
+ ...
+ >>> two_arg.__doc__ # doctest: +ELLIPSIS
+ "...overloading.two_arg_int_int (<type 'int'>, <type 'int'>):\\n\\n...overloading.two_arg_float_int (<type 'float'>, <type 'int'>):\\n\\n...overloading.two_arg_int_float (<type 'int'>, <type 'float'>):"
+ >>> two_arg(9, 10)
+ 0x9 0xa
+ >>> two_arg(9.0, 10)
+ 9.0 0xa
+ >>> two_arg(15, 16.0)
+ 0xf 16.0
+ >>> two_arg.default_func(9, 10)
+ 9 10
+ >>> two_arg.find_func ((int, float)) == two_arg_int_float
+ True
+ >>> (int, float) in two_arg
+ True
+ >>> (str, int) in two_arg
+ False
+ >>>
+ >>>
+ >>>
+ >>> #################
+ >>> #wildcard
+ >>> @two_arg.register(AnyType, str)
+ ... def two_arg_any_str (x, y):
+ ... print x, y.lower()
+ ...
+ >>> two_arg("Hello", "World")
+ Hello world
+ >>> two_arg(500, "World")
+ 500 world
+ """
+
+ def __init__(self, default_func):
+ # Decorator to declare new overloaded function.
+ self.registry = {}
+ self.cache = {}
+ self.default_func = default_func
+ self.__name__ = self.default_func.__name__
+ self.__doc__ = self.default_func.__doc__
+ self.__dict__.update (self.default_func.__dict__)
+
+ def __get__(self, obj, type=None):
+ if obj is None:
+ return self
+ return new.instancemethod(self, obj)
+
+ def register(self, *types):
+ """
+ Decorator to register an implementation for a specific set of types.
+
+ .register(t1, t2)(f) is equivalent to .register_func((t1, t2), f).
+ """
+
+ def helper(func):
+ self.register_func(types, func)
+
+ originalDoc = self.__doc__ if self.__doc__ is not None else ""
+ typeNames = ", ".join ([str(type) for type in types])
+ typeNames = "".join ([func.__module__+".", func.__name__, " (", typeNames, "):"])
+ overloadedDoc = ""
+ if func.__doc__ is not None:
+ overloadedDoc = textwrap.fill (func.__doc__, width=60, initial_indent="\t", subsequent_indent="\t")
+ self.__doc__ = "\n".join ([originalDoc, "", typeNames, overloadedDoc]).strip()
+
+ new_func = func
+
+ #Masking the function, so we want to take on its traits
+ if func.__name__ == self.__name__:
+ self.__dict__.update (func.__dict__)
+ new_func = self
+ return new_func
+
+ return helper
+
+ def register_func(self, types, func):
+ """Helper to register an implementation."""
+ self.registry[tuple(types)] = func
+ self.cache = {} # Clear the cache (later we can optimize this).
+
+ def __call__(self, *args):
+ """Call the overloaded function."""
+ types = tuple(map(type, args))
+ func = self.cache.get(types)
+ if func is None:
+ self.cache[types] = func = self.find_func(types)
+ return func(*args)
+
+ def __contains__ (self, types):
+ return self.find_func(types) is not self.default_func
+
+ def find_func(self, types):
+ """Find the appropriate overloaded function; don't call it.
+
+ @note This won't work for old-style classes or classes without __mro__
+ """
+ func = self.registry.get(types)
+ if func is not None:
+ # Easy case -- direct hit in registry.
+ return func
+
+ # Phillip Eby suggests to use issubclass() instead of __mro__.
+ # There are advantages and disadvantages.
+
+ # I can't help myself -- this is going to be intense functional code.
+ # Find all possible candidate signatures.
+ mros = tuple(inspect.getmro(t) for t in types)
+ n = len(mros)
+ candidates = [sig for sig in self.registry
+ if len(sig) == n and
+ all(t in mro for t, mro in zip(sig, mros))]
+
+ if not candidates:
+ # No match at all -- use the default function.
+ return self.default_func
+ elif len(candidates) == 1:
+ # Unique match -- that's an easy case.
+ return self.registry[candidates[0]]
+
+ # More than one match -- weed out the subordinate ones.
+
+ def dominates(dom, sub,
+ orders=tuple(dict((t, i) for i, t in enumerate(mro))
+ for mro in mros)):
+ # Predicate to decide whether dom strictly dominates sub.
+ # Strict domination is defined as domination without equality.
+ # The arguments dom and sub are type tuples of equal length.
+ # The orders argument is a precomputed auxiliary data structure
+ # giving dicts of ordering information corresponding to the
+ # positions in the type tuples.
+ # A type d dominates a type s iff order[d] <= order[s].
+ # A type tuple (d1, d2, ...) dominates a type tuple of equal length
+ # (s1, s2, ...) iff d1 dominates s1, d2 dominates s2, etc.
+ if dom is sub:
+ return False
+ return all(order[d] <= order[s] for d, s, order in zip(dom, sub, orders))
+
+ # I suppose I could inline dominates() but it wouldn't get any clearer.
+ candidates = [cand
+ for cand in candidates
+ if not any(dominates(dom, cand) for dom in candidates)]
+ if len(candidates) == 1:
+ # There's exactly one candidate left.
+ return self.registry[candidates[0]]
+
+ # Perhaps these multiple candidates all have the same implementation?
+ funcs = set(self.registry[cand] for cand in candidates)
+ if len(funcs) == 1:
+ return funcs.pop()
+
+ # No, the situation is irreducibly ambiguous.
+		raise TypeError("ambiguous call; types=%r; candidates=%r" % (types, candidates))
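The dominance rule at the heart of find_func() can be illustrated in isolation; a sketch using MRO position as the specificity order (earlier in the MRO means more specific):

```python
import inspect


def dominates(dom, sub, mros):
    # dom strictly dominates sub iff every type in dom sits at least as
    # early in the corresponding MRO and the two tuples are not identical
    if dom == sub:
        return False
    orders = [dict((t, i) for i, t in enumerate(mro)) for mro in mros]
    return all(order[d] <= order[s]
               for d, s, order in zip(dom, sub, orders))


class A(object):
    pass


class B(A):
    pass


mros = (inspect.getmro(B),)
assert dominates((B,), (A,), mros)       # B is more specific than A
assert not dominates((A,), (B,), mros)   # and not the other way around
```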
--- /dev/null
+#!/usr/bin/python2.5
+
+"""
+@bug In update description stuff
+"""
+
+import os
+import sys
+
+try:
+ import py2deb
+except ImportError:
+ import fake_py2deb as py2deb
+
+import constants
+
+
+__appname__ = constants.__app_name__
+__description__ = "Google Voice Communication Plugin"
+__author__ = "Ed Page"
+__email__ = "eopage@byu.net"
+__version__ = constants.__version__
+__build__ = constants.__build__
+__changelog__ = """
+0.1.0
+* Initial release
+"""
+
+
+__postinstall__ = """#!/bin/sh -e
+
+gtk-update-icon-cache -f /usr/share/icons/hicolor
+"""
+
+def find_files(path):
+ for root, dirs, files in os.walk(path):
+ for file in files:
+ if file.startswith("src-"):
+ fileParts = file.split("-")
+ unused, relPathParts, newName = fileParts[0], fileParts[1:-1], fileParts[-1]
+ assert unused == "src"
+ relPath = os.sep.join(relPathParts)
+ yield relPath, file, newName
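The `src-` prefix convention encodes a relative install path inside the file name itself; for a hypothetical file named `src-gvoice-session.py` the split works out as follows:

```python
import os

fileName = "src-gvoice-session.py"  # hypothetical example name
fileParts = fileName.split("-")
unused, relPathParts, newName = fileParts[0], fileParts[1:-1], fileParts[-1]

assert unused == "src"              # the marker prefix
assert relPathParts == ["gvoice"]   # the relative directory components
assert newName == "session.py"      # the name the file is installed under
assert os.sep.join(relPathParts) == "gvoice"
```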
+
+
+def unflatten_files(files):
+ d = {}
+ for relPath, oldName, newName in files:
+ if relPath not in d:
+ d[relPath] = []
+ d[relPath].append((oldName, newName))
+ return d
+
+
+def build_package(distribution):
+ try:
+ os.chdir(os.path.dirname(sys.argv[0]))
+ except:
+ pass
+
+ py2deb.Py2deb.SECTIONS = py2deb.SECTIONS_BY_POLICY[distribution]
+ p = py2deb.Py2deb(__appname__)
+ p.description = __description__
+ p.upgradeDescription = __changelog__.split("\n\n", 1)[0]
+ p.author = __author__
+ p.mail = __email__
+ p.license = "lgpl"
+ p.depends = ", ".join([
+ "python (>= 2.5)",
+ "python-dbus",
+ "python-gobject",
+ "python-telepathy",
+ ])
+ p.section = {
+ "debian": "comm",
+ "chinook": "communication",
+ "diablo": "user/network",
+ "fremantle": "user/network",
+ "mer": "user/network",
+ }[distribution]
+ p.arch = "all"
+ p.urgency = "low"
+ p.distribution = "chinook diablo fremantle mer debian"
+ p.repository = "extras"
+ p.changelog = __changelog__
+ p.postinstall = __postinstall__
+ p.icon = {
+ "debian": "26x26-theonering.png",
+ "chinook": "26x26-theonering.png",
+ "diablo": "26x26-theonering.png",
+ "fremantle": "64x64-theonering.png", # Fremantle natively uses 48x48
+ "mer": "64x64-theonering.png",
+ }[distribution]
+ for relPath, files in unflatten_files(find_files(".")).iteritems():
+ fullPath = "/usr/lib/theonering"
+ if relPath:
+ fullPath += os.sep+relPath
+ p[fullPath] = list(
+ "|".join((oldName, newName))
+ for (oldName, newName) in files
+ )
+ p["/usr/share/dbus-1/services"] = ["org.freedesktop.Telepathy.ConnectionManager.theonering.service.in"]
+ p["/usr/share/telepathy/managers"] = ["theonering.manager"]
+ p["/usr/share/icons/hicolor/26x26/hildon"] = ["26x26-theonering.png|theonering.png"]
+ p["/usr/share/icons/hicolor/64x64/hildon"] = ["64x64-theonering.png|theonering.png"]
+ p["/usr/share/icons/hicolor/scalable/hildon"] = ["scale-theonering.png|theonering.png"]
+
+ print p
+ print p.generate(
+ version="%s-%s" % (__version__, __build__),
+ changelog=__changelog__,
+ build=False,
+ tar=True,
+ changes=True,
+ dsc=True,
+ )
+ print "Building for %s finished" % distribution
+
+
+if __name__ == "__main__":
+	if len(sys.argv) > 1:
+		try:
+			import optparse
+		except ImportError:
+			optparse = None
+
+		if optparse is not None:
+			parser = optparse.OptionParser()
+			(commandOptions, commandArgs) = parser.parse_args()
+		else:
+			commandArgs = sys.argv[1:]
+	else:
+		commandArgs = ["diablo"]
+	build_package(commandArgs[0])
--- /dev/null
+import pprint
+
+
+class Py2deb(object):
+
+ def __init__(self, appName):
+ self._appName = appName
+ self.description = ""
+ self.author = ""
+ self.mail = ""
+ self.license = ""
+ self.depends = ""
+ self.section = ""
+ self.arch = ""
+		self.urgency = ""
+ self.distribution = ""
+ self.repository = ""
+ self.changelog = ""
+ self.postinstall = ""
+ self.icon = ""
+ self._install = {}
+
+	def generate(self, version, changelog="", build=True, tar=False, changes=False, dsc=False, src=False):
+		return """
+Package: %s
+Version: %s
+Changes:
+%s
+
+Build Options:
+	Tar: %s
+	Dsc: %s
+	Changes: %s
+	Build: %s
+	Src: %s
+	""" % (
+			self._appName, version, changelog, tar, dsc, changes, build, src
+		)
+
+ def __str__(self):
+ parts = []
+ parts.append("%s Package Settings:" % (self._appName, ))
+ for settingName in dir(self):
+ if settingName.startswith("_"):
+ continue
+ parts.append("\t%s: %s" % (settingName, getattr(self, settingName)))
+
+ parts.append(pprint.pformat(self._install))
+
+ return "\n".join(parts)
+
+ def __getitem__(self, key):
+ return self._install[key]
+
+ def __setitem__(self, key, item):
+ self._install[key] = item
--- /dev/null
+[D-BUS Service]
+Name=org.freedesktop.Telepathy.ConnectionManager.theonering
+Exec=@LIBEXECDIR@/telepathy-theonering
--- /dev/null
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+##
+## Copyright (C) 2009 manatlan manatlan[at]gmail(dot)com
+##
+## This program is free software; you can redistribute it and/or modify
+## it under the terms of the GNU General Public License as published
+## by the Free Software Foundation; version 2 only.
+##
+## This program is distributed in the hope that it will be useful,
+## but WITHOUT ANY WARRANTY; without even the implied warranty of
+## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+## GNU General Public License for more details.
+##
+"""
+Known limitations:
+- doesn't sign the package (-us -uc)
+- no distinction between author and maintainer (packager)
+
+depends on:
+- dpkg-dev (dpkg-buildpackage)
+- alien
+- python
+- fakeroot
+
+changelog
+ - ??? ?/??/20?? (By epage)
+ - PEP8
+ - added recommends
+ - fixed bug where it couldn't handle the contents of the pre/post scripts being specified
+ - Added customization based on the targeted policy for sections (Maemo support)
+ - Added maemo specific tarball, dsc, changes file generation support (including icon support)
+ - Added armel architecture
+ - Reduced the size of params being passed around by reducing the calls to locals()
+ - Added repository, distribution, priority
+ - Made setting control file a bit more flexible
+ - 0.5 05/09/2009
+ - pre/post install/remove scripts enabled
+ - deb package install py2deb in dist-packages for py2.6
+ - 0.4 14/10/2008
+ - use os.environ USERNAME or USER (debian way)
+ - install on py 2.(4,5,6) (*FIX* do better here)
+
+"""
+
+import os
+import hashlib
+import sys
+import shutil
+import time
+import string
+import StringIO
+import stat
+import commands
+import base64
+import tarfile
+from glob import glob
+from datetime import datetime
+import socket # gethostname()
+from subprocess import Popen, PIPE
+
+__version__ = "0.5"
+__author__ = "manatlan"
+__mail__ = "manatlan@gmail.com"
+
+
+PERMS_URW_GRW_OR = stat.S_IRUSR | stat.S_IWUSR | \
+ stat.S_IRGRP | stat.S_IWGRP | \
+ stat.S_IROTH
+
+UID_ROOT = 0
+GID_ROOT = 0
+
+
+def run(cmds):
+ p = Popen(cmds, shell=False, stdout=PIPE, stderr=PIPE)
+ time.sleep(0.01) # to avoid "IOError: [Errno 4] Interrupted system call"
+ out = string.join(p.stdout.readlines()).strip()
+ outerr = string.join(p.stderr.readlines()).strip()
+ return out
+
+
+def deb2rpm(file):
+    txt = run(['alien', '-r', file])
+    return txt.split(" generated")[0]
+
+
+def py2src(TEMP, name):
+    l = glob("%(TEMP)s/%(name)s*.tar.gz" % locals())
+    if len(l) != 1:
+        raise Py2debException("couldn't find the source package tar.gz")
+
+ tar = os.path.basename(l[0])
+ shutil.move(l[0], tar)
+
+ return tar
+
+
+def md5sum(filename):
+    # binary mode so the digest is byte-accurate on all platforms
+    f = open(filename, "rb")
+    try:
+        return hashlib.md5(f.read()).hexdigest()
+    finally:
+        f.close()
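Reading the whole file at once is fine for package-sized inputs, but a chunked variant keeps memory flat for large files; a sketch of an equivalent:

```python
import hashlib


def md5sum_chunked(filename, blocksize=65536):
    digest = hashlib.md5()
    with open(filename, "rb") as f:
        # iter() with a sentinel keeps calling read() until it returns b""
        for block in iter(lambda: f.read(blocksize), b""):
            digest.update(block)
    return digest.hexdigest()
```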
+
+
+class Py2changes(object):
+
+ def __init__(self, ChangedBy, description, changes, files, category, repository, **kwargs):
+        self.options = kwargs  # TODO: Is order important?
+        self.description = description
+        self.changes = changes
+        self.files = files
+        self.category = category
+        self.repository = repository
+        self.ChangedBy = ChangedBy
+
+    def getContent(self):
+        content = ["%s: %s" % (k, v)
+                   for k, v in self.options.iteritems()]
+
+        if self.description:
+            description = self.description.replace("\n", "\n ")
+            content.append('Description: ')
+            content.append(' %s' % description)
+        if self.changes:
+            changes = self.changes.replace("\n", "\n ")
+            content.append('Changes: ')
+            content.append(' %s' % changes)
+        if self.ChangedBy:
+            content.append("Changed-By: %s" % self.ChangedBy)
+
+        content.append('Files:')
+
+        for onefile in self.files:
+            md5 = md5sum(onefile)
+            size = str(os.stat(onefile).st_size)
+            content.append(' %s %s %s %s %s' % (md5, size, self.category, self.repository, os.path.basename(onefile)))
+
+        return "\n".join(content) + "\n\n"
+
+
+def py2changes(params):
+ changescontent = Py2changes(
+ "%(author)s <%(mail)s>" % params,
+ "%(description)s" % params,
+ "%(changelog)s" % params,
+ (
+ "%(TEMP)s/%(name)s_%(version)s.tar.gz" % params,
+ "%(TEMP)s/%(name)s_%(version)s.dsc" % params,
+ ),
+ "%(section)s" % params,
+ "%(repository)s" % params,
+ Format='1.7',
+ Date=time.strftime("%a, %d %b %Y %H:%M:%S +0000", time.gmtime()),
+ Source="%(name)s" % params,
+ Architecture="%(arch)s" % params,
+ Version="%(version)s" % params,
+ Distribution="%(distribution)s" % params,
+ Urgency="%(urgency)s" % params,
+ Maintainer="%(author)s <%(mail)s>" % params
+ )
+ f = open("%(TEMP)s/%(name)s_%(version)s.changes" % params,"wb")
+ f.write(changescontent.getContent())
+ f.close()
+
+ fileHandle = open('/tmp/py2deb2.tmp', 'w')
+ fileHandle.write('#!/bin/sh\n')
+ fileHandle.write("cd " +os.getcwd()+ "\n")
+ fileHandle.write("gpg --local-user %(mail)s --clearsign %(TEMP)s/%(name)s_%(version)s.changes\n" % params)
+ fileHandle.write("mv %(TEMP)s/%(name)s_%(version)s.changes.asc %(TEMP)s/%(name)s_%(version)s.changes\n" % params)
+ fileHandle.write('\nexit')
+ fileHandle.close()
+ commands.getoutput("chmod 777 /tmp/py2deb2.tmp")
+ commands.getoutput("/tmp/py2deb2.tmp")
+
+ ret = []
+
+ l=glob("%(TEMP)s/%(name)s*.tar.gz" % params)
+ if len(l)!=1:
+        raise Py2debException("didn't find source package tar.gz")
+ tar = os.path.basename(l[0])
+ shutil.move(l[0],tar)
+ ret.append(tar)
+
+ l=glob("%(TEMP)s/%(name)s*.dsc" % params)
+ if len(l)!=1:
+        raise Py2debException("didn't find source package dsc")
+ tar = os.path.basename(l[0])
+ shutil.move(l[0],tar)
+ ret.append(tar)
+
+ l=glob("%(TEMP)s/%(name)s*.changes" % params)
+ if len(l)!=1:
+        raise Py2debException("didn't find source package changes")
+ tar = os.path.basename(l[0])
+ shutil.move(l[0],tar)
+ ret.append(tar)
+
+ return ret
+
+
+class Py2dsc(object):
+
+ def __init__(self, StandardsVersion, BuildDepends, files, **kwargs):
+ self.options = kwargs # TODO: Is order important?
+ self.StandardsVersion = StandardsVersion
+ self.BuildDepends=BuildDepends
+ self.files=files
+
+ @property
+ def content(self):
+ content = ["%s: %s" % (k, v)
+ for k,v in self.options.iteritems()]
+
+ if self.BuildDepends:
+ content.append("Build-Depends: %s" % self.BuildDepends)
+ if self.StandardsVersion:
+ content.append("Standards-Version: %s" % self.StandardsVersion)
+
+ content.append('Files:')
+
+        for onefile in self.files:
+            md5 = md5sum(onefile)
+            size = str(os.stat(onefile).st_size)
+            content.append(' %s %s %s' % (md5, size, os.path.basename(onefile)))
+
+ return "\n".join(content)+"\n\n"
+
+
+def py2dsc(TEMP, name, version, depends, author, mail, arch):
+ dsccontent = Py2dsc(
+ "%(version)s" % locals(),
+ "%(depends)s" % locals(),
+ ("%(TEMP)s/%(name)s_%(version)s.tar.gz" % locals(),),
+ Format='1.0',
+ Source="%(name)s" % locals(),
+ Version="%(version)s" % locals(),
+ Maintainer="%(author)s <%(mail)s>" % locals(),
+ Architecture="%(arch)s" % locals(),
+ )
+
+ filename = "%(TEMP)s/%(name)s_%(version)s.dsc" % locals()
+
+ f = open(filename, "wb")
+ try:
+ f.write(dsccontent.content)
+ finally:
+ f.close()
+
+    fileHandle = open('/tmp/py2deb.tmp', 'w')
+    try:
+        fileHandle.write('#!/bin/sh\n')
+        fileHandle.write("cd " + os.getcwd() + "\n")
+        fileHandle.write("gpg --local-user %(mail)s --clearsign %(TEMP)s/%(name)s_%(version)s.dsc\n" % locals())
+        fileHandle.write("mv %(TEMP)s/%(name)s_%(version)s.dsc.asc %(filename)s\n" % locals())
+        fileHandle.write('\nexit')
+    finally:
+        fileHandle.close()  # close the script handle, not the already-closed .dsc handle
+
+ commands.getoutput("chmod 777 /tmp/py2deb.tmp")
+ commands.getoutput("/tmp/py2deb.tmp")
+
+ return filename
+
+
+class Py2tar(object):
+
+ def __init__(self, dataDirectoryPath):
+ self._dataDirectoryPath = dataDirectoryPath
+
+ def packed(self):
+ return self._getSourcesFiles()
+
+ def _getSourcesFiles(self):
+ directoryPath = self._dataDirectoryPath
+
+ outputFileObj = StringIO.StringIO() # TODO: Do more transparently?
+
+ tarOutput = tarfile.TarFile.open('sources',
+ mode = "w:gz",
+ fileobj = outputFileObj)
+
+ # Note: We can't use this because we need to fiddle permissions:
+ # tarOutput.add(directoryPath, arcname = "")
+
+ for root, dirs, files in os.walk(directoryPath):
+ archiveRoot = root[len(directoryPath):]
+
+ tarinfo = tarOutput.gettarinfo(root, archiveRoot)
+ # TODO: Make configurable?
+ tarinfo.uid = UID_ROOT
+ tarinfo.gid = GID_ROOT
+ tarinfo.uname = ""
+ tarinfo.gname = ""
+ tarOutput.addfile(tarinfo)
+
+ for f in files:
+ tarinfo = tarOutput.gettarinfo(os.path.join(root, f),
+ os.path.join(archiveRoot, f))
+ tarinfo.uid = UID_ROOT
+ tarinfo.gid = GID_ROOT
+ tarinfo.uname = ""
+ tarinfo.gname = ""
+ tarOutput.addfile(tarinfo, file(os.path.join(root, f)))
+
+ tarOutput.close()
+
+ data_tar_gz = outputFileObj.getvalue()
+
+ return data_tar_gz
+
+
+def py2tar(DEST, TEMP, name, version):
+ tarcontent = Py2tar("%(DEST)s" % locals())
+ filename = "%(TEMP)s/%(name)s_%(version)s.tar.gz" % locals()
+ f = open(filename, "wb")
+ try:
+ f.write(tarcontent.packed())
+ finally:
+ f.close()
+ return filename
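+# Py2tar above normalizes ownership entry by entry while building the archive
+# in memory. The same idea in a compact, self-contained sketch (the function
+# name is illustrative, not part of py2deb):

```python
import io
import os
import tarfile


def pack_directory(path):
    # Build a gzipped tar of `path` in memory, forcing root ownership
    # (uid/gid 0, empty user/group names) so the archive content does
    # not depend on which user runs the build.
    buf = io.BytesIO()
    tar = tarfile.open(mode="w:gz", fileobj=buf)
    for root, dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            info = tar.gettarinfo(full, os.path.relpath(full, path))
            info.uid = info.gid = 0
            info.uname = info.gname = ""
            with open(full, "rb") as fh:
                tar.addfile(info, fh)
    tar.close()
    return buf.getvalue()
```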
+
+
+class Py2debException(Exception):
+ pass
+
+
+SECTIONS_BY_POLICY = {
+ # http://www.debian.org/doc/debian-policy/ch-archive.html#s-subsections
+ "debian": "admin, base, comm, contrib, devel, doc, editors, electronics, embedded, games, gnome, graphics, hamradio, interpreters, kde, libs, libdevel, mail, math, misc, net, news, non-free, oldlibs, otherosfs, perl, python, science, shells, sound, tex, text, utils, web, x11",
+ # http://maemo.org/forrest-images/pdf/maemo-policy.pdf
+ "chinook": "accessories, communication, games, multimedia, office, other, programming, support, themes, tools",
+ # http://wiki.maemo.org/Task:Package_categories
+ "diablo": "user/desktop, user/development, user/education, user/games, user/graphics, user/multimedia, user/navigation, user/network, user/office, user/science, user/system, user/utilities",
+ # http://wiki.maemo.org/Task:Fremantle_application_categories
+ "mer": "user/desktop, user/development, user/education, user/games, user/graphics, user/multimedia, user/navigation, user/network, user/office, user/science, user/system, user/utilities",
+ # http://wiki.maemo.org/Task:Fremantle_application_categories
+ "fremantle": "user/desktop, user/development, user/education, user/games, user/graphics, user/multimedia, user/navigation, user/network, user/office, user/science, user/system, user/utilities",
+}
+
+
+LICENSE_AGREEMENT = {
+ "gpl": """
+ This package is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this package; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+On Debian systems, the complete text of the GNU General
+Public License can be found in `/usr/share/common-licenses/GPL'.
+""",
+ "lgpl":"""
+ This package is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2 of the License, or (at your option) any later version.
+
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this package; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+On Debian systems, the complete text of the GNU Lesser General
+Public License can be found in `/usr/share/common-licenses/LGPL'.
+""",
+ "bsd": """
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted under the terms of the BSD License.
+
+ THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
+ ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
+ OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
+ OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
+ SUCH DAMAGE.
+
+On Debian systems, the complete text of the BSD License can be
+found in `/usr/share/common-licenses/BSD'.
+""",
+ "artistic": """
+ This program is free software; you can redistribute it and/or modify it
+ under the terms of the "Artistic License" which comes with Debian.
+
+ THIS PACKAGE IS PROVIDED "AS IS" AND WITHOUT ANY EXPRESS OR IMPLIED
+ WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES
+ OF MERCHANTIBILITY AND FITNESS FOR A PARTICULAR PURPOSE.
+
+On Debian systems, the complete text of the Artistic License
+can be found in `/usr/share/common-licenses/Artistic'.
+"""
+}
+
+
+class Py2deb(object):
+ """
+    Heavily based on the technique described here:
+    http://wiki.showmedo.com/index.php?title=LinuxJensMakingDeb
+ """
+ ## STATICS
+ clear = False # clear build folder after py2debianization
+
+ SECTIONS = SECTIONS_BY_POLICY["debian"]
+
+ #http://www.debian.org/doc/debian-policy/footnotes.html#f69
+ ARCHS = "all i386 ia64 alpha amd64 armeb arm hppa m32r m68k mips mipsel powerpc ppc64 s390 s390x sh3 sh3eb sh4 sh4eb sparc darwin-i386 darwin-ia64 darwin-alpha darwin-amd64 darwin-armeb darwin-arm darwin-hppa darwin-m32r darwin-m68k darwin-mips darwin-mipsel darwin-powerpc darwin-ppc64 darwin-s390 darwin-s390x darwin-sh3 darwin-sh3eb darwin-sh4 darwin-sh4eb darwin-sparc freebsd-i386 freebsd-ia64 freebsd-alpha freebsd-amd64 freebsd-armeb freebsd-arm freebsd-hppa freebsd-m32r freebsd-m68k freebsd-mips freebsd-mipsel freebsd-powerpc freebsd-ppc64 freebsd-s390 freebsd-s390x freebsd-sh3 freebsd-sh3eb freebsd-sh4 freebsd-sh4eb freebsd-sparc kfreebsd-i386 kfreebsd-ia64 kfreebsd-alpha kfreebsd-amd64 kfreebsd-armeb kfreebsd-arm kfreebsd-hppa kfreebsd-m32r kfreebsd-m68k kfreebsd-mips kfreebsd-mipsel kfreebsd-powerpc kfreebsd-ppc64 kfreebsd-s390 kfreebsd-s390x kfreebsd-sh3 kfreebsd-sh3eb kfreebsd-sh4 kfreebsd-sh4eb kfreebsd-sparc knetbsd-i386 knetbsd-ia64 knetbsd-alpha knetbsd-amd64 knetbsd-armeb knetbsd-arm knetbsd-hppa knetbsd-m32r knetbsd-m68k knetbsd-mips knetbsd-mipsel knetbsd-powerpc knetbsd-ppc64 knetbsd-s390 knetbsd-s390x knetbsd-sh3 knetbsd-sh3eb knetbsd-sh4 knetbsd-sh4eb knetbsd-sparc netbsd-i386 netbsd-ia64 netbsd-alpha netbsd-amd64 netbsd-armeb netbsd-arm netbsd-hppa netbsd-m32r netbsd-m68k netbsd-mips netbsd-mipsel netbsd-powerpc netbsd-ppc64 netbsd-s390 netbsd-s390x netbsd-sh3 netbsd-sh3eb netbsd-sh4 netbsd-sh4eb netbsd-sparc openbsd-i386 openbsd-ia64 openbsd-alpha openbsd-amd64 openbsd-armeb openbsd-arm openbsd-hppa openbsd-m32r openbsd-m68k openbsd-mips openbsd-mipsel openbsd-powerpc openbsd-ppc64 openbsd-s390 openbsd-s390x openbsd-sh3 openbsd-sh3eb openbsd-sh4 openbsd-sh4eb openbsd-sparc hurd-i386 hurd-ia64 hurd-alpha hurd-amd64 hurd-armeb hurd-arm hurd-hppa hurd-m32r hurd-m68k hurd-mips hurd-mipsel hurd-powerpc hurd-ppc64 hurd-s390 hurd-s390x hurd-sh3 hurd-sh3eb hurd-sh4 hurd-sh4eb hurd-sparc armel".split(" ")
+
+ # license terms taken from dh_make
+ LICENSES = list(LICENSE_AGREEMENT.iterkeys())
+
+ def __setitem__(self, path, files):
+
+        if not isinstance(files, list):
+            raise Py2debException("value of key path '%s' is not a list" % path)
+        if not files:
+            raise Py2debException("value of key path '%s' shouldn't be empty" % path)
+        if not path.startswith("/"):
+            raise Py2debException("key path '%s' is malformed (it must start with '/')" % path)
+        if path.endswith("/"):
+            raise Py2debException("key path '%s' is malformed (it must not end with '/')" % path)
+
+ nfiles=[]
+ for file in files:
+
+ if ".." in file:
+ raise Py2debException("file '%s' contains '..', please avoid that!"%file)
+
+
+ if "|" in file:
+ if file.count("|")!=1:
+ raise Py2debException("file '%s' is incorrect (more than one pipe)"%file)
+
+ file, nfile = file.split("|")
+ else:
+                nfile = file  # same location
+
+ if os.path.isdir(file):
+                raise Py2debException("file '%s' is a folder; py2deb refuses folders" % file)
+
+ if not os.path.isfile(file):
+ raise Py2debException("file '%s' doesn't exist"%file)
+
+ if file.startswith("/"): # if an absolute file is defined
+ if file==nfile: # and not renamed (pipe trick)
+ nfile=os.path.basename(file) # it's simply copied to 'path'
+
+ nfiles.append((file, nfile))
+
+        nfiles.sort(key=lambda x: x[1])  # sort by new name (nfile)
+
+ self.__files[path]=nfiles
+
+ def __delitem__(self, k):
+ del self.__files[k]
+
+ def __init__(self,
+ name,
+ description="no description",
+ license="gpl",
+ depends="",
+ section="utils",
+ arch="all",
+
+ url="",
+ author = None,
+ mail = None,
+
+ preinstall = None,
+ postinstall = None,
+ preremove = None,
+ postremove = None
+ ):
+
+        if author is None:
+            author = os.environ.get("USERNAME") or os.environ.get("USER") or "unknown"
+
+ if mail is None:
+ mail = author+"@"+socket.gethostname()
+
+ self.name = name
+ self.description = description
+ self.upgradeDescription = ""
+ self.license = license
+ self.depends = depends
+ self.recommends = ""
+ self.section = section
+ self.arch = arch
+ self.url = url
+ self.author = author
+ self.mail = mail
+ self.icon = ""
+ self.distribution = ""
+        self.repository = ""
+ self.urgency = "low"
+
+ self.preinstall = preinstall
+ self.postinstall = postinstall
+ self.preremove = preremove
+ self.postremove = postremove
+
+ self.__files={}
+
+ def __repr__(self):
+ name = self.name
+ license = self.license
+ description = self.description
+ depends = self.depends
+ recommends = self.recommends
+ section = self.section
+ arch = self.arch
+ url = self.url
+ author = self.author
+ mail = self.mail
+
+ preinstall = self.preinstall
+ postinstall = self.postinstall
+ preremove = self.preremove
+ postremove = self.postremove
+
+ paths=self.__files.keys()
+ paths.sort()
+ files=[]
+ for path in paths:
+ for file, nfile in self.__files[path]:
+ #~ rfile=os.path.normpath(os.path.join(path, nfile))
+ rfile=os.path.join(path, nfile)
+ if nfile==file:
+ files.append(rfile)
+ else:
+ files.append(rfile + " (%s)"%file)
+
+ files.sort()
+ files = "\n".join(files)
+
+
+ lscripts = [ preinstall and "preinst",
+ postinstall and "postinst",
+ preremove and "prerm",
+ postremove and "postrm",
+ ]
+ scripts = lscripts and ", ".join([i for i in lscripts if i]) or "None"
+ return """
+----------------------------------------------------------------------
+NAME : %(name)s
+----------------------------------------------------------------------
+LICENSE : %(license)s
+URL : %(url)s
+AUTHOR : %(author)s
+MAIL : %(mail)s
+----------------------------------------------------------------------
+DEPENDS : %(depends)s
+RECOMMENDS : %(recommends)s
+ARCH : %(arch)s
+SECTION : %(section)s
+----------------------------------------------------------------------
+DESCRIPTION :
+%(description)s
+----------------------------------------------------------------------
+SCRIPTS : %(scripts)s
+----------------------------------------------------------------------
+FILES :
+%(files)s
+""" % locals()
+
+ def generate(self, version, changelog="", rpm=False, src=False, build=True, tar=False, changes=False, dsc=False):
+ """ generate a deb of version 'version', with or without 'changelog', with or without a rpm
+ (in the current folder)
+ return a list of generated files
+ """
+        if not any(self.__files.values()):
+ raise Py2debException("no files are defined")
+
+ if not changelog:
+ changelog="* no changelog"
+
+ name = self.name
+ description = self.description
+ license = self.license
+ depends = self.depends
+ recommends = self.recommends
+ section = self.section
+ arch = self.arch
+ url = self.url
+ distribution = self.distribution
+ repository = self.repository
+ urgency = self.urgency
+ author = self.author
+ mail = self.mail
+ files = self.__files
+ preinstall = self.preinstall
+ postinstall = self.postinstall
+ preremove = self.preremove
+ postremove = self.postremove
+
+ if section not in Py2deb.SECTIONS:
+ raise Py2debException("section '%s' is unknown (%s)" % (section, str(Py2deb.SECTIONS)))
+
+ if arch not in Py2deb.ARCHS:
+ raise Py2debException("arch '%s' is unknown (%s)"% (arch, str(Py2deb.ARCHS)))
+
+ if license not in Py2deb.LICENSES:
+ raise Py2debException("License '%s' is unknown (%s)" % (license, str(Py2deb.LICENSES)))
+
+ # create dates (buildDate, buildDateYear)
+ d=datetime.now()
+ buildDate=d.strftime("%a, %d %b %Y %H:%M:%S +0000")
+ buildDateYear=str(d.year)
+
+ #clean description (add a space before each next lines)
+ description=description.replace("\r", "").strip()
+ description = "\n ".join(description.split("\n"))
+
+ #clean changelog (add 2 spaces before each next lines)
+ changelog=changelog.replace("\r", "").strip()
+ changelog = "\n ".join(changelog.split("\n"))
+
+ TEMP = ".py2deb_build_folder"
+ DEST = os.path.join(TEMP, name)
+ DEBIAN = os.path.join(DEST, "debian")
+
+ packageContents = locals()
+
+ # let's start the process
+ try:
+ shutil.rmtree(TEMP)
+ except:
+ pass
+
+ os.makedirs(DEBIAN)
+ try:
+ rules=[]
+ dirs=[]
+ for path in files:
+ for ofile, nfile in files[path]:
+ if os.path.isfile(ofile):
+ # it's a file
+
+ if ofile.startswith("/"): # if absolute path
+ # we need to change dest
+ dest=os.path.join(DEST, nfile)
+ else:
+ dest=os.path.join(DEST, ofile)
+
+ # copy file to be packaged
+ destDir = os.path.dirname(dest)
+ if not os.path.isdir(destDir):
+ os.makedirs(destDir)
+
+ shutil.copy2(ofile, dest)
+
+ ndir = os.path.join(path, os.path.dirname(nfile))
+ nname = os.path.basename(nfile)
+
+ # make a line RULES to be sure the destination folder is created
+ # and one for copying the file
+ fpath = "/".join(["$(CURDIR)", "debian", name+ndir])
+ rules.append('mkdir -p "%s"' % fpath)
+ rules.append('cp -a "%s" "%s"' % (ofile, os.path.join(fpath, nname)))
+
+ # append a dir
+ dirs.append(ndir)
+
+ else:
+                        raise Py2debException("unknown file '%s'" % ofile)  # shouldn't be raised (checked in __setitem__)
+
+ # make rules right
+ rules= "\n\t".join(rules) + "\n"
+ packageContents["rules"] = rules
+
+ # make dirs right
+ dirs= [i[1:] for i in set(dirs)]
+ dirs.sort()
+
+ #==========================================================================
+ # CREATE debian/dirs
+ #==========================================================================
+ open(os.path.join(DEBIAN, "dirs"), "w").write("\n".join(dirs))
+
+ #==========================================================================
+ # CREATE debian/changelog
+ #==========================================================================
+ clog="""%(name)s (%(version)s) stable; urgency=low
+
+ %(changelog)s
+
+ -- %(author)s <%(mail)s> %(buildDate)s
+""" % packageContents
+
+ open(os.path.join(DEBIAN, "changelog"), "w").write(clog)
+
+ #==========================================================================
+ #Create pre/post install/remove
+ #==========================================================================
+ def mkscript(name, dest):
+ if name and name.strip()!="":
+ if os.path.isfile(name): # it's a file
+ content = file(name).read()
+ else: # it's a script
+ content = name
+ open(os.path.join(DEBIAN, dest), "w").write(content)
+
+ mkscript(preinstall, "preinst")
+ mkscript(postinstall, "postinst")
+ mkscript(preremove, "prerm")
+ mkscript(postremove, "postrm")
+
+
+ #==========================================================================
+ # CREATE debian/compat
+ #==========================================================================
+ open(os.path.join(DEBIAN, "compat"), "w").write("5\n")
+
+ #==========================================================================
+ # CREATE debian/control
+ #==========================================================================
+ generalParagraphFields = [
+ "Source: %(name)s",
+ "Maintainer: %(author)s <%(mail)s>",
+ "Section: %(section)s",
+ "Priority: extra",
+ "Build-Depends: debhelper (>= 5)",
+ "Standards-Version: 3.7.2",
+ ]
+
+ specificParagraphFields = [
+ "Package: %(name)s",
+ "Architecture: %(arch)s",
+ "Depends: %(depends)s",
+ "Recommends: %(recommends)s",
+ "Description: %(description)s",
+ ]
+
+ if self.upgradeDescription:
+ upgradeDescription = "XB-Maemo-Upgrade-Description: %s" % self.upgradeDescription.strip()
+ specificParagraphFields.append("\n ".join(upgradeDescription.split("\n")))
+
+ if self.icon:
+ f = open(self.icon, "rb")
+ try:
+ rawIcon = f.read()
+ finally:
+ f.close()
+ uueIcon = base64.b64encode(rawIcon)
+ uueIconLines = []
+ for i, c in enumerate(uueIcon):
+ if i % 60 == 0:
+ uueIconLines.append("")
+ uueIconLines[-1] += c
+ uueIconLines[0:0] = ("XB-Maemo-Icon-26:", )
+ specificParagraphFields.append("\n ".join(uueIconLines))
+
+ generalParagraph = "\n".join(generalParagraphFields)
+ specificParagraph = "\n".join(specificParagraphFields)
+ controlContent = "\n\n".join((generalParagraph, specificParagraph)) % packageContents
+ open(os.path.join(DEBIAN, "control"), "w").write(controlContent)
+
+ #==========================================================================
+ # CREATE debian/copyright
+ #==========================================================================
+ packageContents["txtLicense"] = LICENSE_AGREEMENT[license]
+ packageContents["pv"] =__version__
+ txt="""This package was py2debianized(%(pv)s) by %(author)s <%(mail)s> on
+%(buildDate)s.
+
+It was downloaded from %(url)s
+
+Upstream Author: %(author)s <%(mail)s>
+
+Copyright: %(buildDateYear)s by %(author)s
+
+License:
+
+%(txtLicense)s
+
+The Debian packaging is (C) %(buildDateYear)s, %(author)s <%(mail)s> and
+is licensed under the GPL, see above.
+
+
+# Please also look if there are files or directories which have a
+# different copyright/license attached and list them here.
+""" % packageContents
+ open(os.path.join(DEBIAN, "copyright"), "w").write(txt)
+
+ #==========================================================================
+ # CREATE debian/rules
+ #==========================================================================
+ txt="""#!/usr/bin/make -f
+# -*- makefile -*-
+# Sample debian/rules that uses debhelper.
+# This file was originally written by Joey Hess and Craig Small.
+# As a special exception, when this file is copied by dh-make into a
+# dh-make output file, you may use that output file without restriction.
+# This special exception was added by Craig Small in version 0.37 of dh-make.
+
+# Uncomment this to turn on verbose mode.
+#export DH_VERBOSE=1
+
+
+
+
+CFLAGS = -Wall -g
+
+ifneq (,$(findstring noopt,$(DEB_BUILD_OPTIONS)))
+ CFLAGS += -O0
+else
+ CFLAGS += -O2
+endif
+
+configure: configure-stamp
+configure-stamp:
+ dh_testdir
+ # Add here commands to configure the package.
+
+ touch configure-stamp
+
+
+build: build-stamp
+
+build-stamp: configure-stamp
+ dh_testdir
+ touch build-stamp
+
+clean:
+ dh_testdir
+ dh_testroot
+ rm -f build-stamp configure-stamp
+ dh_clean
+
+install: build
+ dh_testdir
+ dh_testroot
+ dh_clean -k
+ dh_installdirs
+
+ # ======================================================
+ #$(MAKE) DESTDIR="$(CURDIR)/debian/%(name)s" install
+ mkdir -p "$(CURDIR)/debian/%(name)s"
+
+ %(rules)s
+ # ======================================================
+
+# Build architecture-independent files here.
+binary-indep: build install
+# We have nothing to do by default.
+
+# Build architecture-dependent files here.
+binary-arch: build install
+ dh_testdir
+ dh_testroot
+ dh_installchangelogs debian/changelog
+ dh_installdocs
+ dh_installexamples
+# dh_install
+# dh_installmenu
+# dh_installdebconf
+# dh_installlogrotate
+# dh_installemacsen
+# dh_installpam
+# dh_installmime
+# dh_python
+# dh_installinit
+# dh_installcron
+# dh_installinfo
+ dh_installman
+ dh_link
+ dh_strip
+ dh_compress
+ dh_fixperms
+# dh_perl
+# dh_makeshlibs
+ dh_installdeb
+ dh_shlibdeps
+ dh_gencontrol
+ dh_md5sums
+ dh_builddeb
+
+binary: binary-indep binary-arch
+.PHONY: build clean binary-indep binary-arch binary install configure
+""" % packageContents
+ open(os.path.join(DEBIAN, "rules"), "w").write(txt)
+ os.chmod(os.path.join(DEBIAN, "rules"), 0755)
+
+ ###########################################################################
+ ###########################################################################
+ ###########################################################################
+
+ generatedFiles = []
+
+ if build:
+ #http://www.debian.org/doc/manuals/maint-guide/ch-build.fr.html
+ ret = os.system('cd "%(DEST)s"; dpkg-buildpackage -tc -rfakeroot -us -uc' % packageContents)
+ if ret != 0:
+ raise Py2debException("buildpackage failed (see output)")
+
+ l=glob("%(TEMP)s/%(name)s*.deb" % packageContents)
+ if len(l) != 1:
+                    raise Py2debException("didn't find the built deb")
+
+ tdeb = l[0]
+ deb = os.path.basename(tdeb)
+ shutil.move(tdeb, deb)
+
+ generatedFiles = [deb, ]
+
+ if rpm:
+ rpmFilename = deb2rpm(deb)
+ generatedFiles.append(rpmFilename)
+
+ if src:
+ tarFilename = py2src(TEMP, name)
+ generatedFiles.append(tarFilename)
+
+ if tar:
+ tarFilename = py2tar(DEST, TEMP, name, version)
+ generatedFiles.append(tarFilename)
+
+ if dsc:
+ dscFilename = py2dsc(TEMP, name, version, depends, author, mail, arch)
+ generatedFiles.append(dscFilename)
+
+ if changes:
+ changesFilenames = py2changes(packageContents)
+ generatedFiles.extend(changesFilenames)
+
+ return generatedFiles
+
+ #~ except Exception,m:
+ #~ raise Py2debException("build error :"+str(m))
+
+ finally:
+ if Py2deb.clear:
+ shutil.rmtree(TEMP)
+
+
+if __name__ == "__main__":
+ try:
+ os.chdir(os.path.dirname(sys.argv[0]))
+ except:
+ pass
+
+ p=Py2deb("python-py2deb")
+ p.description="Generate simple deb(/rpm/tgz) from python (2.4, 2.5 and 2.6)"
+ p.url = "http://www.manatlan.com/page/py2deb"
+ p.author=__author__
+ p.mail=__mail__
+ p.depends = "dpkg-dev, fakeroot, alien, python"
+ p.section="python"
+ p["/usr/lib/python2.6/dist-packages"] = ["py2deb.py", ]
+ p["/usr/lib/python2.5/site-packages"] = ["py2deb.py", ]
+ p["/usr/lib/python2.4/site-packages"] = ["py2deb.py", ]
+ #~ p.postinstall = "s.py"
+ #~ p.preinstall = "s.py"
+ #~ p.postremove = "s.py"
+ #~ p.preremove = "s.py"
+ print p
+ print p.generate(__version__, changelog = __doc__, src=True)
--- /dev/null
+# lint Python modules using external checkers.
+#
+# This is the main checker controlling the other ones and the reports
+# generation. It is itself both a raw checker and an astng checker in order
+# to:
+# * handle message activation / deactivation at the module level
+# * handle some basic but necessary stats data (number of classes, methods...)
+#
+[MASTER]
+
+# Specify a configuration file.
+#rcfile=
+
+# Python code to execute, usually for sys.path manipulation such as
+# pygtk.require().
+#init-hook=
+
+# Profiled execution.
+profile=no
+
+# Add <file or directory> to the black list. It should be a base name, not a
+# path. You may set this option multiple times.
+ignore=CVS
+
+# Pickle collected data for later comparisons.
+persistent=yes
+
+# Set the cache size for astng objects.
+cache-size=500
+
+# List of plugins (as comma separated values of python modules names) to load,
+# usually to register additional checkers.
+load-plugins=
+
+
+[MESSAGES CONTROL]
+
+# Enable only checker(s) with the given id(s). This option conflicts with the
+# disable-checker option
+#enable-checker=
+
+# Enable all checker(s) except those with the given id(s). This option
+# conflicts with the enable-checker option
+#disable-checker=
+
+# Enable all messages in the listed categories.
+#enable-msg-cat=
+
+# Disable all messages in the listed categories.
+#disable-msg-cat=
+
+# Enable the message(s) with the given id(s).
+#enable-msg=
+
+# Disable the message(s) with the given id(s).
+disable-msg=W0403,W0612,W0613,C0103,C0111,C0301,R0903,W0142,W0603,R0904,R0921,R0201
+
+[REPORTS]
+
+# set the output format. Available formats are text, parseable, colorized, msvs
+# (visual studio) and html
+output-format=colorized
+
+# Include message's id in output
+include-ids=yes
+
+# Put messages in a separate file for each module / package specified on the
+# command line instead of printing them on stdout. Reports (if any) will be
+# written in a file name "pylint_global.[txt|html]".
+files-output=no
+
+# Tells whether to display a full report or only the messages
+reports=no
+
+# Python expression which should return a score of at most 10 (10 is the
+# highest score). You have access to the variables errors, warning, refactor,
+# convention and statement, which respectively contain the number of messages
+# of each kind and the total number of statements analyzed. This is used by
+# the global evaluation report (R0004).
+evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
+
+# Add a comment according to your evaluation note. This is used by the global
+# evaluation report (R0004).
+comment=no
+
+# Enable the report(s) with the given id(s).
+#enable-report=
+
+# Disable the report(s) with the given id(s).
+#disable-report=
+
+
+# checks for
+# * unused variables / imports
+# * undefined variables
+# * redefinition of variable from builtins or from an outer scope
+# * use of a variable before assignment
+#
+[VARIABLES]
+
+# Tells whether we should check for unused imports in __init__ files.
+init-import=no
+
+# A regular expression matching names used for dummy variables (i.e. not used).
+dummy-variables-rgx=_|dummy
+
+# List of additional names supposed to be defined in builtins. Remember that
+# you should avoid to define new builtins when possible.
+additional-builtins=
+
+
+# checks for :
+# * doc strings
+# * modules / classes / functions / methods / arguments / variables name
+# * number of arguments, local variables, branches, returns and statements in
+# functions, methods
+# * required module attributes
+# * dangerous default values as arguments
+# * redefinition of function / method / class
+# * uses of the global statement
+#
+[BASIC]
+
+# Required attributes for module, separated by a comma
+required-attributes=
+
+# Regular expression which should only match function or class names that do
+# not require a docstring
+no-docstring-rgx=__.*__
+
+# Regular expression which should only match correct module names
+module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
+
+# Regular expression which should only match correct module level names
+const-rgx=(([A-Z_][A-Z1-9_]*)|(__.*__))$
+
+# Regular expression which should only match correct class names
+class-rgx=[A-Z_][a-zA-Z0-9]+$
+
+# Regular expression which should only match correct function names
+function-rgx=[a-z_][a-z0-9_]{2,30}$
+
+# Regular expression which should only match correct method names
+method-rgx=[a-z_][a-z0-9_]{2,30}$
+
+# Regular expression which should only match correct instance attribute names
+attr-rgx=[a-z_][a-zA-Z0-9_]{2,30}$
+
+# Regular expression which should only match correct argument names
+argument-rgx=[a-z_][a-zA-Z0-9_]{2,30}$
+
+# Regular expression which should only match correct variable names
+variable-rgx=[a-z_][a-zA-Z0-9_]{2,30}$
+
+# Regular expression which should only match correct list comprehension /
+# generator expression variable names
+inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
+
+# Good variable names which should always be accepted, separated by a comma
+good-names=i,j,k,ex,Run,_
+
+# Bad variable names which should always be refused, separated by a comma
+bad-names=foo,bar,baz,toto,tutu,tata
+
+# List of builtins function names that should not be used, separated by a comma
+bad-functions=map,filter,apply,input
+
+
+# try to find bugs in the code using type inference
+#
+[TYPECHECK]
+
+# Tells whether missing members accessed in a mixin class should be ignored. A
+# mixin class is detected if its name ends with "mixin" (case insensitive).
+ignore-mixin-members=yes
+
+# When zope mode is activated, the acquired-members option is considered to
+# ignore access to some undefined attributes.
+zope=no
+
+# List of members which are usually acquired through Zope's acquisition
+# mechanism and so shouldn't trigger E0201 when accessed (needs zope=yes to be
+# considered).
+acquired-members=REQUEST,acl_users,aq_parent
+
+
+# checks for signs of poor design:
+# * number of methods, attributes, local variables...
+# * size, complexity of functions, methods
+#
+[DESIGN]
+
+# Maximum number of arguments for function / method
+max-args=5
+
+# Maximum number of locals for function / method body
+max-locals=15
+
+# Maximum number of return / yield for function / method body
+max-returns=6
+
+# Maximum number of branches for function / method body
+max-branchs=12
+
+# Maximum number of statements in function / method body
+max-statements=50
+
+# Maximum number of parents for a class (see R0901).
+max-parents=7
+
+# Maximum number of attributes for a class (see R0902).
+max-attributes=15
+
+# Minimum number of public methods for a class (see R0903).
+min-public-methods=1
+
+# Maximum number of public methods for a class (see R0904).
+max-public-methods=20
+
+
+# checks for :
+# * methods without self as first argument
+# * overridden methods signature
+# * access only to existent members via self
+# * attributes not defined in the __init__ method
+# * supported interfaces implementation
+# * unreachable code
+#
+[CLASSES]
+
+# List of interface methods to ignore, separated by a comma. This is used, for
+# instance, to avoid checking methods defined in Zope's Interface base class.
+ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
+
+# List of method names used to declare (i.e. assign) instance attributes.
+defining-attr-methods=__init__,__new__,setUp
+
+
+# checks for
+# * external modules dependencies
+# * relative / wildcard imports
+# * cyclic imports
+# * uses of deprecated modules
+#
+[IMPORTS]
+
+# Deprecated modules which should not be used, separated by a comma
+deprecated-modules=regsub,string,TERMIOS,Bastion,rexec
+
+# Create a graph of all (i.e. internal and external) dependencies in the
+# given file (report R0402 must not be disabled)
+import-graph=
+
+# Create a graph of external dependencies in the given file (report R0402 must
+# not be disabled)
+ext-import-graph=
+
+# Create a graph of internal dependencies in the given file (report R0402 must
+# not be disabled)
+int-import-graph=
+
+
+# checks for similarities and duplicated code. This computation may be
+# memory / CPU intensive, so you should disable it if you experience
+# performance problems.
+#
+[SIMILARITIES]
+
+# Minimum number of lines for a similarity.
+min-similarity-lines=4
+
+# Ignore comments when computing similarities.
+ignore-comments=yes
+
+# Ignore docstrings when computing similarities.
+ignore-docstrings=yes
+
+
+# checks for:
+# * warning notes in the code like FIXME, XXX
+# * PEP 263: source code with non-ASCII characters but no encoding declaration
+#
+[MISCELLANEOUS]
+
+# List of note tags to take into consideration, separated by a comma.
+notes=FIXME,XXX,TODO
+
+
+# checks for :
+# * unauthorized constructions
+# * strict indentation
+# * line length
+# * use of <> instead of !=
+#
+[FORMAT]
+
+# Maximum number of characters on a single line.
+# @note Limiting this to the most extreme cases
+max-line-length=100
+
+# Maximum number of lines in a module
+max-module-lines=1000
+
+# String used as indentation unit. This is usually "    " (4 spaces) or "\t"
+# (1 tab).
+indent-string='\t'
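+
+# A hypothetical invocation using this rcfile (assumes pylint is installed and
+# that this file is saved as "pylintrc" in the project root):
+#   pylint --rcfile=pylintrc src/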
--- /dev/null
+#!/usr/bin/env python
+
+import commands
+
+
+verbose = False
+
+
+def syntax_test(filename):
+	commandTemplate = """
+	python -tt -W all -c "import py_compile; py_compile.compile('%(filename)s', doraise=False)" """
+	compileCommand = commandTemplate % {"filename": filename}
+	(status, text) = commands.getstatusoutput(compileCommand)
+	text = text.rstrip()
+	passed = len(text) == 0
+
+	if passed:
+		output = ("Syntax is correct for %s" % filename) if verbose else ""
+	else:
+		output = ("Syntax is invalid for %s\n" % filename) if verbose else ""
+		output += text
+	return (passed, output)
+
+
+if __name__ == "__main__":
+ import sys
+ import optparse
+
+ opar = optparse.OptionParser()
+ opar.add_option("-v", "--verbose", dest="verbose", help="Toggle verbosity", action="store_true", default=False)
+ options, args = opar.parse_args(sys.argv[1:])
+ verbose = options.verbose
+
+ completeOutput = []
+ allPassed = True
+ for filename in args:
+ passed, output = syntax_test(filename)
+ if not passed:
+ allPassed = False
+ if output.strip():
+ completeOutput.append(output)
+ print "\n".join(completeOutput)
+
+	sys.exit(0 if allPassed else 1)
--- /dev/null
+[ConnectionManager]
+Name = theonering
+BusName = org.freedesktop.Telepathy.ConnectionManager.theonering
+ObjectPath = /org/freedesktop/Telepathy/ConnectionManager/theonering
+
+[Protocol GoogleVoice]
+param-username = s required
+param-password = s required secret
+param-forward = s required
--- /dev/null
+#!/usr/bin/env python
+
+from __future__ import with_statement
+
+
+verbose = False
+
+
+def tag_parser(file, tag):
+ """
+ >>> nothing = []
+ >>> for todo in tag_parser(nothing, "@todo"):
+ ... print todo
+ ...
+ >>> one = ["@todo Help!"]
+ >>> for todo in tag_parser(one, "@todo"):
+ ... print todo
+ ...
+ 1: @todo Help!
+ >>> mixed = ["one", "@todo two", "three"]
+ >>> for todo in tag_parser(mixed, "@todo"):
+ ... print todo
+ ...
+ 2: @todo two
+ >>> embedded = ["one @todo two", "three"]
+ >>> for todo in tag_parser(embedded, "@todo"):
+ ... print todo
+ ...
+ 1: @todo two
+ >>> continuation = ["one", "@todo two", " three"]
+ >>> for todo in tag_parser(continuation, "@todo"):
+ ... print todo
+ ...
+ 2: @todo two three
+ >>> series = ["one", "@todo two", "@todo three"]
+ >>> for todo in tag_parser(series, "@todo"):
+ ... print todo
+ ...
+ 2: @todo two
+ 3: @todo three
+ """
+ currentTodo = []
+ prefix = None
+ for lineNumber, line in enumerate(file):
+ column = line.find(tag)
+ if column != -1:
+ if currentTodo:
+ yield "\n".join (currentTodo)
+ prefix = line[0:column]
+ currentTodo = ["%d: %s" % (lineNumber+1, line[column:].strip())]
+ elif prefix is not None and len(prefix)+1 < len(line) and line.startswith(prefix) and line[len(prefix)].isspace():
+ currentTodo.append (line[len(prefix):].rstrip())
+ elif currentTodo:
+ yield "\n".join (currentTodo)
+ currentTodo = []
+ prefix = None
+ if currentTodo:
+ yield "\n".join (currentTodo)
+
+
+def tag_finder(filename, tag):
+ todoList = []
+
+ with open(filename) as file:
+ body = "\n".join (tag_parser(file, tag))
+ passed = not body
+ if passed:
+ output = "No %s's for %s" % (tag, filename) if verbose else ""
+ else:
+ header = "%s's for %s:\n" % (tag, filename) if verbose else ""
+ output = header + body
+ output += "\n" if verbose else ""
+
+ return (passed, output)
+
+
+if __name__ == "__main__":
+ import sys
+ import optparse
+
+ opar = optparse.OptionParser()
+ opar.add_option("-v", "--verbose", dest="verbose", help="Toggle verbosity", action="store_true", default=False)
+ options, args = opar.parse_args(sys.argv[1:])
+ verbose = options.verbose
+
+ bugsAsError = True
+ todosAsError = False
+
+ completeOutput = []
+ allPassed = True
+ for filename in args:
+ bugPassed, bugOutput = tag_finder(filename, "@bug")
+ todoPassed, todoOutput = tag_finder(filename, "@todo")
+ output = "\n".join ([bugOutput, todoOutput])
+ if (not bugPassed and bugsAsError) or (not todoPassed and todosAsError):
+ allPassed = False
+ output = output.strip()
+ if output:
+ completeOutput.append(filename+":\n"+output+"\n\n")
+ print "\n".join(completeOutput)
+
+	sys.exit(0 if allPassed else 1)
--- /dev/null
+#!/usr/bin/env python
+
+import os
+import urllib
+import urllib2
+import traceback
+import warnings
+
+import sys
+sys.path.append("../../src")
+
+import browser_emu
+import gv_backend
+
+# Create Browser
+browser = browser_emu.MozillaEmulator(1)
+cookieFile = os.path.join(".", ".gv_cookies.txt")
+browser.cookies.filename = cookieFile
+
+# Login
+username = sys.argv[1]
+password = sys.argv[2]
+
+loginPostData = urllib.urlencode({
+ 'Email' : username,
+ 'Passwd' : password,
+ 'service': "grandcentral",
+ "ltmpl": "mobile",
+ "btmpl": "mobile",
+ "PersistentCookie": "yes",
+})
+
+try:
+ loginSuccessOrFailurePage = browser.download(gv_backend.GVDialer._loginURL, loginPostData)
+except urllib2.URLError, e:
+ warnings.warn(traceback.format_exc())
+	raise RuntimeError("%s is not accessible" % gv_backend.GVDialer._loginURL)
+
+forwardPage = browser.download(gv_backend.GVDialer._forwardURL)
+
+tokenGroup = gv_backend.GVDialer._tokenRe.search(forwardPage)
+if tokenGroup is None:
+ print forwardPage
+ raise RuntimeError("Could not extract authentication token from GoogleVoice")
+token = tokenGroup.group(1)
+
+
+with open("cookies.txt", "w") as f:
+ f.writelines(
+ "%s: %s\n" % (c.name, c.value)
+ for c in browser.cookies
+ )
--- /dev/null
+#!/usr/bin/env python
+
+import os
+import urllib
+import urllib2
+import traceback
+import warnings
+
+import sys
+sys.path.append("../../src")
+
+import browser_emu
+import gv_backend
+
+webpages = [
+ ("login", gv_backend.GVDialer._loginURL),
+ ("contacts", gv_backend.GVDialer._contactsURL),
+ ("voicemail", gv_backend.GVDialer._voicemailURL),
+ ("sms", gv_backend.GVDialer._smsURL),
+ ("forward", gv_backend.GVDialer._forwardURL),
+ ("recent", gv_backend.GVDialer._recentCallsURL),
+ ("placed", gv_backend.GVDialer._placedCallsURL),
+	("received", gv_backend.GVDialer._receivedCallsURL),
+ ("missed", gv_backend.GVDialer._missedCallsURL),
+]
+
+
+# Create Browser
+browser = browser_emu.MozillaEmulator(1)
+cookieFile = os.path.join(".", ".gv_cookies.txt")
+browser.cookies.filename = cookieFile
+
+# Get Pages
+for name, url in webpages:
+ try:
+ page = browser.download(url)
+ except StandardError, e:
+ print e.message
+ continue
+ with open("not_loggedin_%s.txt" % name, "w") as f:
+ f.write(page)
+
+# Login
+username = sys.argv[1]
+password = sys.argv[2]
+
+loginPostData = urllib.urlencode({
+ 'Email' : username,
+ 'Passwd' : password,
+ 'service': "grandcentral",
+ "ltmpl": "mobile",
+ "btmpl": "mobile",
+ "PersistentCookie": "yes",
+})
+
+try:
+ loginSuccessOrFailurePage = browser.download(gv_backend.GVDialer._loginURL, loginPostData)
+except urllib2.URLError, e:
+ warnings.warn(traceback.format_exc())
+	raise RuntimeError("%s is not accessible" % gv_backend.GVDialer._loginURL)
+with open("loggingin.txt", "w") as f:
+	f.write(loginSuccessOrFailurePage)
+
+forwardPage = browser.download(gv_backend.GVDialer._forwardURL)
+
+tokenGroup = gv_backend.GVDialer._tokenRe.search(forwardPage)
+if tokenGroup is None:
+ print forwardPage
+ raise RuntimeError("Could not extract authentication token from GoogleVoice")
+token = tokenGroup.group(1)
+
+# Get Pages
+for name, url in webpages:
+ try:
+ page = browser.download(url)
+ except StandardError, e:
+ warnings.warn(traceback.format_exc())
+ continue
+ print "Writing to file"
+ with open("loggedin_%s.txt" % name, "w") as f:
+ f.write(page)
--- /dev/null
+from __future__ import with_statement
+
+import logging
+
+import sys
+sys.path.append("../src")
+
+import util.coroutines as coroutines
+
+import gvoice
+
+
+logging.basicConfig(level=logging.DEBUG)
+
+
+class MockBackend(object):
+
+ def __init__(self, contactsData):
+ self.contactsData = contactsData
+
+ def get_contacts(self):
+ return (
+ (i, contactData["name"])
+ for (i, contactData) in enumerate(self.contactsData)
+ )
+
+ def get_contact_details(self, contactId):
+ return self.contactsData[contactId]["details"]
+
+
+def generate_update_callback(callbackData):
+
+ @coroutines.func_sink
+ @coroutines.expand_positional
+ def callback(book, addedContacts, removedContacts, changedContacts):
+ callbackData.append((book, addedContacts, removedContacts, changedContacts))
+
+ return callback
+
+
+def test_no_contacts():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([])
+ book = gvoice.addressbook.Addressbook(backend)
+ book.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(book.get_contacts())
+ assert len(contacts) == 0
+
+
+def test_one_contact_no_details():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "name": "One",
+ "details": [],
+ },
+ ])
+ book = gvoice.addressbook.Addressbook(backend)
+ book.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(book.get_contacts())
+ assert len(contacts) == 1
+ id = contacts[0]
+ name = book.get_contact_name(id)
+ assert name == backend.contactsData[id]["name"]
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(book.get_contacts())
+ assert len(contacts) == 1
+ id = contacts[0]
+ name = book.get_contact_name(id)
+ assert name == backend.contactsData[id]["name"]
+
+ contactDetails = list(book.get_contact_details(id))
+ assert len(contactDetails) == 0
+
+
+def test_one_contact_with_details():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "name": "One",
+ "details": [("Type A", "123"), ("Type B", "456"), ("Type C", "789")],
+ },
+ ])
+ book = gvoice.addressbook.Addressbook(backend)
+ book.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(book.get_contacts())
+ assert len(contacts) == 1
+ id = contacts[0]
+ name = book.get_contact_name(id)
+ assert name == backend.contactsData[id]["name"]
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(book.get_contacts())
+ assert len(contacts) == 1
+ id = contacts[0]
+ name = book.get_contact_name(id)
+ assert name == backend.contactsData[id]["name"]
+
+ contactDetails = list(book.get_contact_details(id))
+ print "%r" % contactDetails
+ assert len(contactDetails) == 3
+ assert contactDetails[0][0] == "Type A"
+ assert contactDetails[0][1] == "123"
+ assert contactDetails[1][0] == "Type B"
+ assert contactDetails[1][1] == "456"
+ assert contactDetails[2][0] == "Type C"
+ assert contactDetails[2][1] == "789"
+
+
+def test_adding_a_contact():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "name": "One",
+ "details": [],
+ },
+ ])
+ book = gvoice.addressbook.Addressbook(backend)
+ book.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ backend.contactsData.append({
+ "name": "Two",
+ "details": [],
+ })
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 1, "%r" % callbackData
+
+ callbackBook, addedContacts, removedContacts, changedContacts = callbackData[0]
+ assert callbackBook is book
+ assert len(addedContacts) == 1
+ assert 1 in addedContacts
+ assert len(removedContacts) == 0
+ assert len(changedContacts) == 0
+
+
+def test_removing_a_contact():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "name": "One",
+ "details": [],
+ },
+ ])
+ book = gvoice.addressbook.Addressbook(backend)
+ book.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ del backend.contactsData[:]
+
+ book.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ book.update(force=True)
+ assert len(callbackData) == 1, "%r" % callbackData
+
+ callbackBook, addedContacts, removedContacts, changedContacts = callbackData[0]
+ assert callbackBook is book
+ assert len(addedContacts) == 0
+ assert len(removedContacts) == 1
+ assert 0 in removedContacts
+ assert len(changedContacts) == 0
--- /dev/null
+from __future__ import with_statement
+
+import datetime
+import logging
+
+import sys
+sys.path.append("../src")
+
+import util.coroutines as coroutines
+
+import gvoice
+
+
+logging.basicConfig(level=logging.DEBUG)
+
+
+class MockBackend(object):
+
+ def __init__(self, conversationsData):
+ self.conversationsData = conversationsData
+
+ def get_messages(self):
+ return self.conversationsData
+
+
+def generate_update_callback(callbackData):
+
+ @coroutines.func_sink
+ @coroutines.expand_positional
+ def callback(conversations, updatedIds):
+ callbackData.append((conversations, updatedIds))
+
+ return callback
+
+
+def test_no_conversations():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([])
+ conversings = gvoice.conversations.Conversations(backend)
+ conversings.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ contacts = list(conversings.get_conversations())
+ assert len(contacts) == 0
+
+
+def test_a_conversation():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "id": "conv1",
+ "contactId": "con1",
+ "name": "Con Man",
+ "time": datetime.datetime(2000, 1, 1),
+ "relTime": "Sometime back",
+ "prettyNumber": "(555) 555-1224",
+ "number": "5555551224",
+ "location": "",
+ "messageParts": [
+ ("Innocent Man", "Body of Message", "Forever ago")
+ ],
+ },
+ ])
+ conversings = gvoice.conversations.Conversations(backend)
+ conversings.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ cons = list(conversings.get_conversations())
+ assert len(cons) == 1
+ assert cons[0] == ("con1", "5555551224"), cons
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+
+def test_adding_a_conversation():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "id": "conv1",
+ "contactId": "con1",
+ "name": "Con Man",
+ "time": datetime.datetime(2000, 1, 1),
+ "relTime": "Sometime back",
+ "prettyNumber": "(555) 555-1224",
+ "number": "5555551224",
+ "location": "",
+ "messageParts": [
+ ("Innocent Man", "Body of Message", "Forever ago")
+ ],
+ },
+ ])
+ conversings = gvoice.conversations.Conversations(backend)
+ conversings.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ cons = list(conversings.get_conversations())
+ assert len(cons) == 1
+ assert cons[0] == ("con1", "5555551224"), cons
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ backend.conversationsData.append(
+ {
+ "id": "conv2",
+ "contactId": "con2",
+ "name": "Pretty Man",
+ "time": datetime.datetime(2003, 1, 1),
+ "relTime": "Somewhere over the rainbow",
+ "prettyNumber": "(555) 555-2244",
+ "number": "5555552244",
+ "location": "",
+ "messageParts": [
+ ("Con Man", "Body of Message somewhere", "Maybe")
+ ],
+ },
+ )
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 1, "%r" % callbackData
+ idsOnly = callbackData[0][1]
+ assert ("con2", "5555552244") in idsOnly, idsOnly
+
+ cons = list(conversings.get_conversations())
+ assert len(cons) == 2
+ assert ("con1", "5555551224") in cons, cons
+ assert ("con2", "5555552244") in cons, cons
+
+
+def test_merging_a_conversation():
+ callbackData = []
+ callback = generate_update_callback(callbackData)
+
+ backend = MockBackend([
+ {
+ "id": "conv1",
+ "contactId": "con1",
+ "name": "Con Man",
+ "time": datetime.datetime(2000, 1, 1),
+ "relTime": "Sometime back",
+ "prettyNumber": "(555) 555-1224",
+ "number": "5555551224",
+ "location": "",
+ "messageParts": [
+ ("Innocent Man", "Body of Message", "Forever ago")
+ ],
+ },
+ ])
+ conversings = gvoice.conversations.Conversations(backend)
+ conversings.updateSignalHandler.register_sink(callback)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ cons = list(conversings.get_conversations())
+ assert len(cons) == 1
+ assert cons[0] == ("con1", "5555551224"), cons
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ backend.conversationsData.append(
+ {
+ "id": "conv1",
+ "contactId": "con1",
+ "name": "Con Man",
+ "time": datetime.datetime(2003, 1, 1),
+ "relTime": "Sometime back",
+ "prettyNumber": "(555) 555-1224",
+ "number": "5555551224",
+ "location": "",
+ "messageParts": [
+ ("Innocent Man", "Mwahahaah", "somewhat closer")
+ ],
+ },
+ )
+
+ conversings.update()
+ assert len(callbackData) == 0, "%r" % callbackData
+
+ conversings.update(force=True)
+ assert len(callbackData) == 1, "%r" % callbackData
+ idsOnly = callbackData[0][1]
+ assert ("con1", "5555551224") in idsOnly, idsOnly
+	conversation = conversings.get_conversation(idsOnly.pop())
+	assert len(conversation["messageParts"]) == 2, conversation["messageParts"]
--- /dev/null
+from __future__ import with_statement
+
+import cookielib
+import logging
+
+import test_utils
+
+import sys
+sys.path.append("../src")
+
+import gvoice
+
+
+logging.basicConfig(level=logging.DEBUG)
+
+
+def generate_mock(cookiesSucceed, username, password):
+
+ class MockModule(object):
+
+ class MozillaEmulator(object):
+
+ def __init__(self, trycount = 1):
+ self.cookies = cookielib.LWPCookieJar()
+ self.trycount = trycount
+
+ def download(self, url,
+ postdata = None, extraheaders = None, forbid_redirect = False,
+ trycount = None, only_head = False,
+ ):
+ return ""
+
+ return MockModule
+
+
+def test_not_logged_in():
+ correctUsername, correctPassword = "", ""
+ MockBrowserModule = generate_mock(False, correctUsername, correctPassword)
+ gvoice.backend.browser_emu, RealBrowser = MockBrowserModule, gvoice.backend.browser_emu
+ try:
+ backend = gvoice.backend.GVoiceBackend()
+ assert not backend.is_authed()
+ assert not backend.login("bad_name", "bad_password")
+ backend.logout()
+ with test_utils.expected(RuntimeError):
+ backend.dial("5551234567")
+ with test_utils.expected(RuntimeError):
+ backend.send_sms("5551234567", "Hello World")
+ assert backend.get_account_number() == "", "%s" % backend.get_account_number()
+ gvoice.backend.set_sane_callback(backend)
+ assert backend.get_callback_number() == ""
+ with test_utils.expected(Exception):
+ recent = list(backend.get_recent())
+ with test_utils.expected(Exception):
+ messages = list(backend.get_messages())
+ finally:
+ gvoice.backend.browser_emu = RealBrowser
--- /dev/null
+#!/usr/bin/env python
+
+
+from __future__ import with_statement
+
+import inspect
+import contextlib
+import functools
+
+
+def TODO(func):
+ """
+ unittest test method decorator that ignores
+ exceptions raised by test
+
+ Used to annotate test methods for code that may
+ not be written yet. Ignores failures in the
+	annotated test method; fails if the test
+ unexpectedly succeeds.
+ !author http://kbyanc.blogspot.com/2007/06/pythons-unittest-module-aint-that-bad.html
+
+ Example:
+ >>> import unittest
+ >>> class ExampleTestCase(unittest.TestCase):
+ ... @TODO
+ ... def testToDo(self):
+ ... MyModule.DoesNotExistYet('boo')
+ ...
+ """
+
+ @functools.wraps(func)
+ def wrapper(*args, **kw):
+ try:
+ func(*args, **kw)
+ succeeded = True
+ except:
+ succeeded = False
+ assert succeeded is False, \
+ "%s marked TODO but passed" % func.__name__
+ return wrapper
+
+
+def PlatformSpecific(platformList):
+ """
+ unittest test method decorator that only
+ runs test method if os.name is in the
+ given list of platforms
+ !author http://kbyanc.blogspot.com/2007/06/pythons-unittest-module-aint-that-bad.html
+ Example:
+ >>> import unittest
+ >>> class ExampleTestCase(unittest.TestCase):
+ ... @PlatformSpecific(('mac', ))
+ ... def testMacOnly(self):
+ ... MyModule.SomeMacSpecificFunction()
+ ...
+ """
+
+ def decorator(func):
+ import os
+
+ @functools.wraps(func)
+ def wrapper(*args, **kw):
+ if os.name in platformList:
+ return func(*args, **kw)
+ return wrapper
+ return decorator
+
+
+def CheckReferences(func):
+ """
+ !author http://kbyanc.blogspot.com/2007/06/pythons-unittest-module-aint-that-bad.html
+ """
+
+ @functools.wraps(func)
+ def wrapper(*args, **kw):
+ refCounts = []
+		for i in range(5):
+			func(*args, **kw)
+			refCounts.append(XXXGetRefCount())
+		assert min(refCounts) == max(refCounts), "Reference counts changed - %r" % refCounts
+
+ return wrapper
+
+
+@contextlib.contextmanager
+def expected(exception):
+	"""
+	>>> with expected(ZeroDivisionError):
+	...     1 / 0
+	>>> with expected(AssertionError("expected {0:ZeroDivisionError} to have been thrown")):
+	...     with expected(ZeroDivisionError):
+	...         1 / 2
+	>>> with expected(Exception("foo")):
+	...     raise Exception("foo")
+	>>> with expected(Exception("bar")):
+	...     with expected(Exception("foo")): # this won't catch it
+	...         raise Exception("bar")
+	...     assert False, "should not see me"
+	>>> with expected(Exception("can specify")):
+	...     raise Exception("can specify prefixes")
+	"""
+	if isinstance(exception, Exception):
+		excType, excValue = type(exception), str(exception)
+	elif isinstance(exception, type):
+		excType, excValue = exception, ""
+	else:
+		raise TypeError("expected an exception instance or type, got %r" % (exception,))
+
+ try:
+ yield
+ except Exception, e:
+ if not (excType in inspect.getmro(type(e)) and str(e).startswith(excValue)):
+ raise
+ else:
+ raise AssertionError("expected {0:%s} to have been thrown" % excType.__name__)
+
+
+if __name__ == "__main__":
+ import doctest
+ doctest.testmod()